Carl Jung’s Myth of the West

We’ve been reading Catafalque, Peter Kingsley’s most recent take on the Presocratics, this time explored through the life and work of Carl Jung. It is a satisfying read and conveys a depth that goes missing in many other Jungian accounts.

However, one thing bothered me. Kingsley kept insisting on the uniqueness of the West, that Westerners must focus on their own culture instead of looking to the East or elsewhere. For a scholar of the ancient world, this seems simplistic and naive. East and West, as we now know them, was not a distinction ancient people would have made. The Greeks were more concerned with differentiating themselves from barbarians, including the tribal peoples of Europe to the west and north of their own lands.

Those Presocratics never thought of themselves as Westerners, except in a relative sense when speaking of those to the east of them, and certainly not as a monolithic identity. In fact, they were part of a syncretistic tradition heavily influenced by the Near and Far East, often by way of Egypt. Some early Greek thinkers credited African-ruled Egypt as the original source of great art and philosophy, a view that would be more fully embraced later in Hellenism. Greek medicine, for example, may have been shaped by Eastern teachings.

We know that many Greeks traveled East, just as many Easterners, including Buddhists and Hindus, traveled to the Greek and Greco-Roman world. This remained true into the period of the Roman Empire, when supposedly there was a Buddhist temple on the Sea of Galilee. The North African church father Augustine was a Manichaean before he converted to Christianity, and that early faith was an amalgamation of a Judaic baptismal cult, Zoroastrianism, and Buddhism. Besides, the Greeks themselves were a wandering people who originated from somewhere else, and throughout their history they kept wandering.

Following Jung’s own cultural defensiveness, Kingsley argues that we Westerners have to look to our own sacred origins and that there is danger in doing otherwise. But Kingsley is an American, shaped by a culture of a thousand influences. And Jung was a northern European. Like most other supposed ‘Westerners’, neither probably had any ancestral roots in the ancient people of Greece or in the Greco-Roman Gnostics whom Jung and Kingsley see as the heirs of the Presocratics.

The Gnostics were essentially the original Christians, who formed out of Judaism, which in turn came from the Near East. Judeo-Christianity, Gnostic or otherwise, was a foreign introduction to the Greco-Roman world and even more foreign to the far west and north of Europe. If Jung was looking for the sacred origins of his own ancestral inheritance, he would have been wiser to look to the tribal paganism that was wiped out by the onslaught of Greco-Roman thought and imperialism. The Christianization of Europe was a genocidal tragedy. Paganism held on in large parts of Europe into the Middle Ages, and some pagan traditions survived into modernity.

Our criticism isn’t of the respect given to these non-Western influences that took over the West. We are likewise fascinated by the Presocratics and Gnostics. But we feel no need to rationalize that they belong to us or we to them. They are foreigners, in both space and time. The ancient Greeks were never a single people. As with the Celts and Jews, to be Greek in the ancient world was a very loose and, at times, extensive identity (Ancient Complexity). Many of the famous Greek thinkers technically weren’t ethnically Greek. It’s similar to how the Irish adopted the trade culture of the Celts, even though they are of Basque origins.

So, what is this fear* of the East seen in Jung’s reluctance while in India? And why has Kingsley adopted it? We are typical American mutts with some possible non-European ancestry mixed in, from African to Native American. And we were raised in a hodgepodge of New Age religion with much Eastern thought and practice thrown in. We have no sacred origins, no particular ancestral homeland. Even our European ancestry originated in different parts of Europe, though none from Italy or Greece, much less the Levant. The Presocratics and Gnostics aren’t our people.

So, it doesn’t bother us to seek wisdom wherever we can find it. It doesn’t cause us fear, the way it did for Jung. He worried about losing himself and, having experienced psychotic breaks earlier in his life, the concern was genuine. He needed a sense of being rooted in a tradition to hold himself together, even if that rootedness was an invented myth. And that doesn’t really bother us either. We remain admirers of Jung’s work, as we appreciate Kingsley’s.

We understand why Jung, having lived through the world war catastrophe that tore apart the Western world, sought a vision of a renewed Western tradition. It may have seemed like a useful and necessary story, but it poses its own dangers. Even if it really was useful then, we question that it is useful now.

* Why didn’t Carl Jung visit Ramana Maharshi after being told by both Zimmer and Brunton?, from Beezone. It has been argued that Carl Jung borrowed his notion of ‘the Self’ from Hinduism, and this notion was key to his own teachings. Maybe this was the fear: that the meeting point between the two cultures would simply overwhelm his own view and his own psyche.

Why Is Average Body Temperature Lowering?

Researchers at Stanford University, analyzing data going back to the 1800s, found that average body temperature has decreased (Myroslava Protsiv et al, Decreasing human body temperature in the United States since the Industrial Revolution). Other data supports the present lower norm (J.S. Hausmann et al, Using Smartphone Crowdsourcing to Redefine Normal and Febrile Temperatures in Adults: Results from the Feverprints Study).
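As a rough sanity check on the headline numbers, a quick conversion shows the reported drop (Wunderlich’s 98.6°F versus a modern average of roughly 97.5°F) is about 0.6°C; the paper reports a decline on the order of 0.03°C per birth decade, which over roughly fifteen decades is of the same magnitude. The figures below are only a back-of-envelope sketch, not numbers from the paper itself:

```python
# Back-of-envelope check on the reported body-temperature decline.
# 98.6 F (Wunderlich, 1860s) vs ~97.5 F (modern average).
old_f, new_f = 98.6, 97.5
drop_c = (old_f - new_f) * 5 / 9         # convert the drop to Celsius
print(round(drop_c, 2))                  # ~0.61 C

# Protsiv et al. report a decline of roughly 0.03 C per birth decade;
# accumulated over ~15 decades, that is the same order of magnitude:
print(round(0.03 * 15, 2))               # ~0.45 C
```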

They considered that improved health, and hence decreased inflammation, could be the cause, but it’s not clear that inflammation overall has decreased. The modern industrial diet of sugar and seed oils is highly inflammatory. Inflammation has been linked to the epidemic of diseases of civilization: obesity, diabetes, heart disease, arthritis, depression, schizophrenia, and much else. In some ways, inflammation is worse than it has ever been. That is why, as a society, we’ve become obsessed with anti-inflammatories, from aspirin to turmeric.

The authors of the paper, however, offer other data that contradicts their preferred hypothesis: “However, a small study of healthy volunteers from Pakistan—a country with a continued high incidence of tuberculosis and other chronic infections—confirms temperatures more closely approximating the values reported by Wunderlich”. Since these were healthy volunteers, they should not have had higher inflammation from infections, parasites, etc. So why were their body temperatures higher than those seen among modern Westerners?

Other potential contributing factors have also been suggested. Ambient temperatures are now highly controlled, so the body has to do less work to maintain an even body temperature. Also, people are less physically active than they once were. The most interesting explanation is that the microbiome has been altered, specifically reduced in the number and variety of heat-producing microbes (Nita Jain, A Microbial-Based Explanation for Cooling Human Body Temperatures).

I might see a clue in the Pakistan data. That population is presumably more likely to be following its traditional diet. If so, its members have been less Westernized in their eating habits, which would translate into fewer refined starchy carbs, less sugar, and fewer seed oils high in omega-6 fatty acids. Their diets might in general be more restricted: fewer calories, smaller portions, less snacking, and longer periods between meals. Plus, as this is an Islamic population, fasting is part of their religious tradition.

This might point to more time spent in and near ketosis, and it might be noted that ketosis is also anti-inflammatory. So why the higher body temperature? Well, there is the microbiome issue. A population on a traditional diet, combined with less antibiotic usage, would likely still be supporting a larger microbiome. Ketosis, by the way, is one of the factors that supports a different kind of microbiome, which relates to its use as a treatment for epilepsy (Rachael Rettner, How the Keto Diet Helps Prevent Seizures: Gut Bacteria May Be Key). And ketosis raises the basal metabolic rate, which in turn raises temperature. Even though fasting lowers body temperature in the short term, as part of an overall ketogenic diet it would help promote higher body temperatures on average.

This is indicated by the research on other animals: “An increased resistance to cold assessed by the rate of fall in body temperature in the animals as well as human beings on a high-fat diet has been reported by LEBLANC (1957) and MITCHELL et al. (1946), respectively. LEBLANC (1957) suggested that the large amount of fat accumulated in animals fed a high-fat diet could not explain, either as a source of energy reserves or as an insulator, the superiority of high-fat diet in a cold environment, postulating some changes induced by a high-fat diet in the organism that permits higher sustained rate of heat production in the cold.” (Akihiro Kuroshima, Effects of Cold Adaptation and High-Fat Diet on Cold Resistance and Metabolic Responses to Acute Exposure in Rats).

And: “Rats on a corn oil diet convert less T4 to active T3 than rats on a lard diet. Rats on a safflower oil diet have a more greatly reduced metabolic response to T3 than rats on a beef fat diet. Rats on a high-PUFA diet have brown fat that’s less responsive to thyroid hormone. Remember, brown fat is the type that generates heat to keep us warm. Rats on a long-term diet high in soybean oil have terrible body temperature regulation, which thyroid function in large part controls” (Mark Sisson, Is Keto Bad For Your Thyroid?). A 1946 study found that subjects on a high-fat diet showed less of a drop in body temperature in response to cold (H.H. Mitchell, The tolerance of man to cold as affected by dietary modification; carbohydrate versus fat and the effect of the frequency of meals).

Specifically about ketosis, in mice it increases energy expenditure and causes brown fat to produce more heat (Shireesh Srivastava, A Ketogenic Diet Increases Brown Adipose Tissue Mitochondrial Proteins and UCP1 Levels in Mice). Other studies confirm this, and some show an increase of brown fat. Brown fat is what keeps us warm. Babies have a lot of it and, in the past, it was thought adults lost it, but it turns out that we maintain brown fat throughout our lives. It’s just that different diets have different effects on it.

Bikman points out the relationship between insulin and ketones: when one is high, the other is low. Insulin tells the body to slow down metabolism and store energy, that is, to produce fat and to shut down the activity of brown fat. Ketones do the opposite, not only activating brown fat but causing white fat to act more like brown fat. This is what causes the metabolic advantage of the keto diet in losing excess body fat and maintaining lower weight, as it increases energy expenditure by 200-300 calories per day (by the common approximation of roughly 3,500 calories per pound of fat, that adds up to over 20 pounds of body fat in a year, if uncompensated). By the way, cold exposure and exercise also activate brown fat, which goes back to general lifestyle factors that go hand in hand with diet.
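The arithmetic behind that metabolic advantage can be sketched with the standard approximation of about 3,500 calories per pound of body fat. The figures below are an illustrative assumption, not numbers from Bikman:

```python
# Hypothetical back-of-envelope: extra daily burn -> body fat per year.
# Assumes ~3,500 kcal per pound of body fat, a common approximation.
extra_kcal_per_day = 250          # midpoint of the 200-300 kcal range
kcal_per_lb_fat = 3500
lbs_per_year = extra_kcal_per_day * 365 / kcal_per_lb_fat
print(round(lbs_per_year, 1))     # ~26 lbs, if fully uncompensated
```

In practice the body compensates for changes in expenditure, so the real-world figure would be smaller; the point is only the order of magnitude.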

Some people attest to feeling warmer in winter while in ketosis (Ketogenic Forums, Ketosis, IF, brown fat, and being warmer in cool weather), although others claim not to handle cold well, which might simply be an issue of how quickly people become fully fat-adapted. A lifetime of a high-carb diet changes the body. But barring permanently damaged biological functioning, the body should eventually be able to shift into more effective ketosis and hence thermogenesis.

In humans, there is an evolutionary explanation for this. Humans are indeed unique in being able to enter and remain in ketosis so easily. But think about when ketosis most often happened in the past and you’ll understand why it seems inefficient, wasting energy as heat, which is a slight metabolic advantage if you’re trying to lose weight. For most of human existence, carb restriction was forced upon the species during the coldest season, when starchy plants don’t grow. That is key.

It was an advantage not only to be able to survive off of one’s own body fat but to simultaneously create extra heat, especially during enforced fasting when food supplies were low, since fasting would otherwise tend to drop body temperature — an argument made by the insulin researcher Benjamin Bikman (see the 9/9/17 interview with Mike Mutzel on High Intensity Health at the 20:34 mark, Insulin, Brown Fat & Ketones w/ Benjamin Bikman, PhD; & see Human Adaptability and Health). Ketosis is a compensatory function for survival during the harshest time of the year, winter.

Maybe modern Westerners have lower body temperature for the same reason they are plagued with diseases of civilization, specifically those having to do with metabolic syndrome and insulin resistance. If we didn’t take so many drugs and other substances to manage inflammation, maybe our body temperature would be higher. But it’s possible the lack of ketosis by itself might be enough to keep it significantly reduced. And if not ketosis, something else about diet and metabolism is likely involved.

* * *

What is the relevance? Does it matter that average body temperature has changed? As I pointed out above, it could indicate that the entirety of physiological functioning has been altered. A major component has to do with metabolism, which relates to diet, gut health, and the microbiome. About the latter, Nita Jain wrote that,

“A 2010 report observed that 36.7° C may be the ideal temperature to ward off fungal infection whilst maintaining metabolism. In other words, high body temperatures represent optimization in the tradeoff between metabolic expenditure and resistance to infectious diseases. Our reduced exposure to potentially pathogenic fungi in developed countries may therefore be another possible factor driving changes in human physiology” (A Microbial-Based Explanation for Cooling Human Body Temperatures).

That would be significant indeed. And it would be far from limited to fungal infections: “In general, a ketogenic diet is useful for treating bacterial and viral infections, because bacteria and viruses don’t have mitochondria, so a ketogenic diet starves them of their favorite fuel source, glucose” (Paleo Leap, Infections and Chronic Disorders). Ketosis, in being anti-inflammatory, has been used to treat gout and autoimmune disorders, along with mood disorders that often include brain inflammation.

The inflammatory pathway, of course, is closely linked to the immune system. Reducing inflammation is part of complex processes in the body. Opposite the keto diet, a high-carb diet produces inflammatory markers that suppress the immune system and so compromise prevention of and healing from infections. Indeed, obese and diabetic patients are hospitalized more often for influenza (flu) infections and get worse symptoms.

But it’s not merely the reduction of inflammation. As an energy source, ketones are preferred over glucose by the immune cells that fight infections, although some bacteria may be able to use ketones. It’s a similar pattern with cancer: ketosis can help prevent some cancers from growing in the early stages, but the danger is that, once established, particular kinds of cancers can adapt to using ketones. So it isn’t as simple as ketosis curing everything, even if it is an overall effective preventative measure for maintaining immunological and general health.

What interests us most here are infections. Let’s look further at the flu. One study gave mice an influenza infection (Emily L. Goldberg et al, Ketogenic diet activates protective γδ T cell responses against influenza virus infection; Abby Olena, Keto Diet Protects Mice from Flu). The mice were on different diets. All of those on standard chow died, but half of those on the keto diet survived. To determine causes, some mice were put on a diet high in both fat and carbs while others were given exogenous ketones, but these mice also died. So it wasn’t only the fat or the ketones in the keto diet. Something about fat metabolism itself seems to have been key: not the fat alone and not the ketones alone, but something about how fat is turned into ketones during ketosis, although some speculate that protein restriction might also have been important.

The researchers were able to pinpoint the mechanisms for fighting off the infection. Turning fat into ketones allows the gamma delta subset of T cells in the lungs to be activated in response to influenza. This was unexpected, as these cells hadn’t been a focus of previous research. The T cells increase mucus production in the epithelial cells of the lungs, creating a protective barrier that traps the virus and allows it to be coughed up. At the same time, the keto diet blocks the production of inflammasomes, the multiprotein complexes that trigger inflammatory responses. This reduces the inflammation that can harm the lungs, and it relates to the T cell stimulation.

From an anecdotal perspective, here is an interesting account: “I have been undergoing a metabolic reset to begin the year. I have been low carb/keto on and off for the last 4.5 years and hop in and out of ketosis for short periods of time when it benefits me or when my body is telling me I need to. Right now, I decided to spend the first 6 weeks of 2018 in ketosis. I check my numbers every morning and have consistently been between 1.2 and 2.2 mmol/L. I contracted a virus two days ago (it was not influenza but I caught something) and my ketone levels shot through the roof. Yesterday morning I was at 5.2 (first morning of being sick) and this morning I was at 5.8 (although now I am in a fasted state as I have decided to fast through this virus.)” (bluesy2, Keto Levels with Virus/Flu).

Maybe that is a normal response for someone in ketosis. The mouse study suggests there is something about the process of producing ketones itself that is involved in the T cell stimulation. The ketones also might have benefits for other reasons, but the process of fat oxidation, or something related to it, might be the actual trigger. In that case, the ketone levels are an indicator of what is going on, namely that the immune system is fully engaged. The important point, though, is that this only happens in a ketogenic state, and it has much to do with basal metabolic rate and body temperature regulation.

* * *

98.6 Degrees Fahrenheit Isn’t the Average Anymore
by Jo Craven McGinty

Nearly 150 years ago, a German physician analyzed a million temperatures from 25,000 patients and concluded that normal human-body temperature is 98.6 degrees Fahrenheit.

That standard has been published in numerous medical texts and helped generations of parents judge the gravity of a child’s illness.

But at least two dozen modern studies have concluded the number is too high.

The findings have prompted speculation that the pioneering analysis published in 1869 by Carl Reinhold August Wunderlich was flawed.

Or was it?

In a new study, researchers from Stanford University argue that Wunderlich’s number was correct at the time but is no longer accurate because the human body has changed.

Today, they say, the average normal human-body temperature is closer to 97.5 degrees Fahrenheit.

“That would be a huge drop for a population,” said Philip Mackowiak, emeritus professor of medicine at the University of Maryland School of Medicine and editor of the book “Fever: Basic Mechanisms and Management.”

Body temperature is a crude proxy for metabolic rate, and if it has fallen, it could offer a clue about other physiological changes that have occurred over time.

“People are taller, fatter and live longer, and we don’t really understand why all those things have happened,” said Julie Parsonnet, who specializes in infectious diseases at Stanford and is senior author of the paper. “Temperature is linked to all those things. The question is which is driving the others.” […]

Overall, temperatures of the Civil War veterans were higher than measurements taken in the 1970s, and, in turn, those measurements were higher than those collected in the 2000s.

“Two things impressed me,” Dr. Parsonnet said. “The magnitude of the change and that temperature has continued to decline at the same rate.” […]

“Wunderlich did a brilliant job,” Dr. Parsonnet said, “but people who walked into his office had tuberculosis, they had dysentery, they had bone infections that had festered their entire lives, they were exposed to infectious diseases we’ve never seen.”

For his study, he did try to measure the temperatures of healthy people, she said, but even so, life expectancy at the time was 38 years, and chronic infections such as gum disease and syphilis afflicted large portions of the population. Dr. Parsonnet suspects inflammation caused by those and other persistent maladies explains the temperature documented by Wunderlich and that a population-level change in inflammation is the most plausible explanation for a decrease in temperature.

Decreasing human body temperature in the United States since the Industrial Revolution
by Myroslava Protsiv, Catherine Ley, Joanna Lankester, Trevor Hastie, Julie Parsonnet


Jean-Francois Toussaint
Feb 15

This substantive and continuing shift in body temperature—a marker for metabolic rate—provides a framework for understanding changes in human health and longevity over 157 years.

Very interesting paper. Well done. However, a hypothesis still remains to be tested. The decline of the infectious burden well corresponds to the decrease of the body temperatures between the XIXth and XXth century cohorts (UAVCW vs NHANES), but it does not explain the further and much more important reduction between the XXth and XXIth century studies (NHANES vs STRIDE); see Figure 1 (distributions gap) and Figure 1 / Supp 1 (curve gap), where the impact seems to be twice as large between 1971 and 2007 than between 1860 and 1971.

Besides regulating the ambient room temperature (through winter heating in the early XXth century and summer air conditioning in the late XXth and early XXIth century), another hypothesis was not discussed here ie the significant decline in daily physical activity, one of the primary drivers of physiological heat production.

Regular physical activity alters core temperature even hours after exercising; 5h of moderate intensity exercise (60% VO2max) also increase the resting heart rate and metabolic rate during the following hours and night with a sympathetic nervous system activated until the next morning (Mischler, 2003) and higher body temperatures measured among the most active individuals (Aoyagi, 2018).

As in most developed countries, the North American people – who worked hard in agriculture or industry during the XIXth century – lost their active daily habits. We are now spending hours, motionless in front of our screens, and most of our adolescents follow this unsettling trend (Twenge, 2019); such an effect on temperature and energy regulation should also be considered as it may have an important impact on the potential progresses of their life expectancy and life duration.

Jean-François Toussaint Université de Paris, Head IRMES

Mischler I, et al. Prolonged Daytime Exercise Repeated Over 4 Days Increases Sleeping Heart Rate and Metabolic Rate. Can J Appl Physiol. Aug 2003; 28 (4): 616-29 DOI: 10.1139/h03-047

Aoyagi Y, et al. Objectively measured habitual physical activity and sleep-related phenomena in 1645 people aged 1–91 years: The Nakanojo Community Study. Prev Med Rep. 2018; 11: 180-6 DOI: 10.1016/j.pmedr.2018.06.013

Twenge JM, et al. Trends in U.S. Adolescents’ media use, 1976–2016: The rise of digital media, the decline of TV, and the (near) demise of print. Psychol Pop Media Cult, 2019; 8(4): 329-45. DOI: 10.1037/ppm0000203

Nita Jain
(edited Feb 15) Feb 14

Although there are many factors that influence resting metabolic rate, change in the population-level of inflammation seems the most plausible explanation for the observed decrease in temperature over time.

Reduced body temperature measurements may also be the result of loss of microbial diversity and rampant antibiotic use in the Western world. Indeed, the authors mention that a small study of healthy volunteers from Pakistan reported higher mean body temperatures than those encountered in developed countries where exposure to antimicrobial products is greater.

Rosenberg et al. reported that heat provision is an under-appreciated contribution of microbiota to hosts. Previous reports have estimated bacterial specific rates of heat production at around 168 mW/gram. From these findings, we can extrapolate that an estimated 70% of human body heat production in a resting state is the result of gut bacterial metabolism.

Consistent with this idea are reports that antibiotic treatment of rabbits and rodents lowers body temperature. Germ-free mice and piglets similarly displayed decreased body temperatures compared to conventionally raised animals and did not produce a fever in response to an infectious stimulus.

Although heat production by symbiotic microbes appears to be a general phenomenon observed in both plants and animals, its significance in humans has hardly been studied. Nonetheless, the concomitant loss of diversity and heat contribution of the gut microbiota may have far-reaching implications for host metabolic health.

A Microbial-Based Explanation for Cooling Human Body Temperatures
by Nita Jain

I would like to propose that our reduced body temperature measurements may be the result of loss of microbial diversity and rampant antibiotic use in the Western world. Indeed, a small study of healthy volunteers from Pakistan reported higher mean body temperatures than those encountered in developed countries where exposure to antimicrobial products is greater.

Heat provision is an under-appreciated contribution of microbiota to hosts. Microbes produce heat as a byproduct when breaking down dietary substrates and creating cell materials. Previous reports have estimated bacterial specific rates of heat production at around 168 mW/gram. From these findings, we can extrapolate that an estimated 70% of human body heat production in a resting state is the result of gut bacterial metabolism.

Consistent with this idea are reports that antibiotic treatment of rabbits and rodents lowers body temperature. Germ-free mice and piglets similarly displayed decreased body temperatures compared to conventionally raised animals and did not produce a fever in response to an infectious stimulus. The relationship also appears to be bi-directional, as host tolerance to cold has been shown to drive changes in the gut microbiomes of blue tilapia.

Heat production in goats was found to decrease by about 50% after emptying the rumen to values similar to what would be expected during a fasting state. These observations suggest that during fasting, microbial fermentation is responsible for half of the animal’s heat production while host metabolism accounts for the other half. The warming effect of microbes has also been reported in plants. Yeast populations residing in floral nectar release heat when breaking down sugar, increasing nectar temperature and modifying the internal flower microenvironment.
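Jain’s 70% extrapolation can be roughly reproduced, though only under assumptions she does not spell out. The gut bacterial mass and resting metabolic rate below are my guesses, chosen to show the arithmetic, not figures from her article:

```python
# Reproducing the ~70% figure under stated assumptions (mine, not Jain's):
# - bacterial heat production: ~168 mW per gram (from the text)
# - gut bacterial mass: ~400 g (assumed; published estimates vary widely)
# - resting metabolic rate: ~2,000 kcal/day
heat_per_gram_w = 0.168                               # 168 mW/g in watts
gut_bacteria_g = 400
bacterial_heat_w = heat_per_gram_w * gut_bacteria_g   # ~67 W

rmr_w = 2000 * 4184 / 86400                           # kcal/day to watts, ~97 W
print(round(bacterial_heat_w / rmr_w, 2))             # ~0.69, roughly 70%
```

The conclusion is clearly sensitive to the assumed bacterial mass, which is one reason the 70% figure should be read as an upper-end extrapolation rather than a measurement.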

Diet and Industrialization, Gender and Class

Below are a couple of articles about the shift in diet since the 19th century. Earlier Americans ate a lot of meat, lard, and butter. It’s how everyone ate — women and men, adults and children — as that was what was available, and everyone ate meals together. Then came a decline in consumption of both red meat and lard in the early 20th century (dairy has also seen a decline). The changes created a divergence in who was eating what.

It’s interesting that, as part of a moral panic and identity crisis, diets became gendered as a way of reinforcing social roles and the social order. It seems strange that industrialization and the gendering of food happened simultaneously, although maybe it’s not so strange. It was largely industrialization, in altering society so dramatically, that caused the sense of panic and crisis. So diet also became heavily politicized and used for social engineering, a self-conscious campaign to create a new kind of society of individualism and the nuclear family.

This period also saw the rise of the middle class as an ideal, along with increasing class anxiety and class war. This led to the popularity of cookbooks within bourgeois culture, as the foods one ate came to define not only gender identity but also class identity. As grains and sugar were only becoming widely available in the 19th century with improved agriculture and international trade, the first popular cookbooks focused on dessert recipes (Liz Susman Karp, Eliza Leslie: The Most Influential Cookbook Writer of the 19th Century). Before that, desserts had been limited to the rich.

Capitalism was transforming everything. The emerging industrial diet was self-consciously created to not only sell products but to sell an identity and lifestyle. It was an entire vision of what defined the good life. Diet became an indicator of one’s place in society, what one aspired toward or was expected to conform to.

* * *

How Steak Became Manly and Salads Became Feminine
Food didn’t become gendered until the late 19th century.
by Paul Freedman

Before the Civil War, the whole family ate the same things together. The era’s best-selling household manuals and cookbooks never indicated that husbands had special tastes that women should indulge.

Even though “women’s restaurants” – spaces set apart for ladies to dine unaccompanied by men – were commonplace, they nonetheless served the same dishes as the men’s dining room: offal, calf’s heads, turtles and roast meat.

Beginning in the 1870s, shifting social norms – like the entry of women into the workplace – gave women more opportunities to dine without men and in the company of female friends or co-workers.

As more women spent time outside of the home, however, they were still expected to congregate in gender-specific places.

Chain restaurants geared toward women, such as Schrafft’s, proliferated. They created alcohol-free safe spaces for women to lunch without experiencing the rowdiness of workingmen’s cafés or free-lunch bars, where patrons could get a free midday meal as long as they bought a beer (or two or three).

It was during this period that the notion that some foods were more appropriate for women started to emerge. Magazines and newspaper advice columns identified fish and white meat with minimal sauce, as well as new products like packaged cottage cheese, as “female foods.” And of course, there were desserts and sweets, which women, supposedly, couldn’t resist.

How Crisco toppled lard – and made Americans believers in industrial food
by Helen Zoe Veit

For decades, Crisco had only one ingredient, cottonseed oil. But most consumers never knew that. That ignorance was no accident.

A century ago, Crisco’s marketers pioneered revolutionary advertising techniques that encouraged consumers not to worry about ingredients and instead to put their trust in reliable brands. It was a successful strategy that other companies would eventually copy. […]

It was only after a chemist named David Wesson pioneered industrial bleaching and deodorizing techniques in the late 19th century that cottonseed oil became clear, tasteless and neutral-smelling enough to appeal to consumers. Soon, companies were selling cottonseed oil by itself as a liquid or mixing it with animal fats to make cheap, solid shortenings, sold in pails to resemble lard.

Shortening’s main rival was lard. Earlier generations of Americans had produced lard at home after autumn pig slaughters, but by the late 19th century meat processing companies were making lard on an industrial scale. Lard had a noticeable pork taste, but there’s not much evidence that 19th-century Americans objected to it, even in cakes and pies. Instead, its issue was cost. While lard prices stayed relatively high through the early 20th century, cottonseed oil was abundant and cheap. […]

In just five years, Americans were annually buying more than 60 million cans of Crisco, the equivalent of three cans for every family in the country. Within a generation, lard went from being a major part of American diets to an old-fashioned ingredient. […]

In the decades that followed Crisco’s launch, other companies followed its lead, introducing products like Spam, Cheetos and Froot Loops with little or no reference to their ingredients.

Once ingredient labeling was mandated in the U.S. in the late 1960s, the multisyllabic ingredients in many highly processed foods may have mystified consumers. But for the most part, they kept on eating.

So if you don’t find it strange to eat foods whose ingredients you don’t know or understand, you have Crisco partly to thank.


Most Americans Don’t Know Real Reason Japan Was Bombed

The United States' bombing of Japan in the Second World War was a demonstration of psychopathic brutality. It was unnecessary, as Japan was already defeated; rather, it was meant to send a message to the Soviets. Before the dust had settled from the savagery, the power-mongers among the Allied leadership were already planning for a Third World War (Cold War Ideology and Self-Fulfilling Prophecies), even though the beleaguered Soviets, who had taken the brunt of the decimation and death count in defeating the Nazis, had no interest in more war.

The United States in particular, having come out of the war wealthier, thought the Soviets would be an easy target and sought to kick its former ally while it was still down. The US, in a fit of paranoia and psychosis, was scheming to drop hundreds of atomic bombs on Russia to eliminate the Soviets before they had the chance to develop nuclear weapons of their own. Yet Stalin never intended, much less planned, to attack the West, nor did he think the Soviets had the capacity to do so. The archives opened after the Soviet collapse showed that Stalin simply wanted to develop a trading partnership with the West, as he had stated was his intention. Through the intervention of spies, the Soviets did start their own nuclear program and then demonstrated their capacity. So a second nuclear attack by the United States was narrowly averted, and the Third World War was downgraded to the Cold War (see article and book at the end of the post).

This topic has come up before in this blog, but let’s come at it from a different angle. Consider General Douglas MacArthur. He was no pacifist, or anything close to one. He was a megalomaniac with good PR, a bully and a jerk, an authoritarian and would-be strongman hungering for power and fame. He “publicly lacked introspection. He was also vain, borderline corrupt, ambitious and prone to feuds” (Andrew Fe, Why was General MacArthur called “Dugout Doug?”). He was also guilty of insubordination, always certain he was right; and when events went well under his command, it was often because he took credit for other people’s ideas, plans, and actions. His arrogance eventually led to his removal from command and the end of his career.

He was despised by many who worked with him and served under him. “President Harry Truman considered MacArthur a glory-seeking egomaniac, describing him at one point as ‘God’s right hand man’” (Alpha History, Douglas MacArthur). Dwight Eisenhower, who knew him well from years of army service, “disliked MacArthur for his vanity, his penchant for theatrics, and for what Eisenhower perceived as ‘irrational’ behavior” (National Park Service, Most Disliked Contemporaries). MacArthur loved war and had a psychopathic level of disregard for the lives of others, sometimes to the extent of seeking victory at any cost. Two examples demonstrate this: one before the Second World War and the other after.

Early in his career, with Eisenhower and George S. Patton under his command, there was the infamous attack on the Bonus Army camp, consisting of WWI veterans — along with their families — protesting for payment of the money they were owed by the federal government (Mickey Z., The Bonus Army). MacArthur was ordered to remove the protesters but to do so non-violently. Instead, as became a pattern with him, he disobeyed those orders by having the protesters gassed and the camp trampled and torched. This led to the deaths of several people, including an infant. It was one of his rare PR disasters, to say the least. And trying to sue journalists for libel didn’t help.

The later example was in 1950. In opposition to President Harry Truman, “MacArthur favored waging all-out war against China. He wanted to drop 20 to 30 atomic bombs on Manchuria, lay a ‘radioactive belt of nuclear-contaminated material’ to sever North Korea from China, and use Chinese Nationalist and American forces to annihilate the million or so Communist Chinese troops in North Korea” (Max Boot, He Has Returned). Some feared that, if the General had his way, he might start another world war… or perhaps the real fear was that China was not the preferred enemy some of the ruling elite wanted to target for the next world war.

Certainly, he was not a nice guy, nor did he have any respect for democracy, human rights, or other such liberal values. If he had been born in Germany instead, he would have made not merely a good Nazi but a great Nazi. He was a right-wing reactionary and violent imperialist, as he was raised to be by his military father, who modeled imperialist aspirations (Rethinking History, Rating General Douglas MacArthur). He felt no sympathy or pity for enemies. Consider how he was willing to treat his fellow citizens, including Bonus Army veterans who had served beside him in the previous world war. His only loyalty was to his own sense of greatness and to the military-industrial establishment that promoted him into power.

But what did General MacArthur, right-wing authoritarian that he was, think about dropping atomic bombs on an already defeated Japan? He thought it an unnecessary and cruel act toward a helpless civilian population consisting mostly of women, children, and the elderly, an opinion he shared with many other military leaders at the time. Besides, as Norman Cousins, consultant to General MacArthur during the occupation of Japan, wrote, “MacArthur… saw no military justification for dropping of the bomb. The war might have ended weeks earlier, he said, if the United States had agreed, as it later did anyway, to the retention of the institution of the emperor” (quoted in Cameron Reilly’s The Psychopath Epidemic).

There was no reason, in his mind, to destroy a country when it was already defeated and instead could serve the purposes of the American Empire. For all of his love of war and violence, he showed no interest in vengeance or public humiliation toward the Japanese people. After the war, he was essentially made an imperial administrator and colonial governor of Japan, and he ruled with paternalistic care and fair-minded understanding. War was one thing and ruling another. Even an authoritarian should be able to tell the difference between these two.

The reasons given for incinerating two large cities and their populations made no sense; Japan couldn’t have fought back at that point even if its leadership had wanted to. What MacArthur understood was that the Japanese simply wanted to save face as much as possible while coming to terms with defeat and negotiating their surrender. Further violence was simply psychopathic brutality. There is no way of getting around that ugly truth. So, why have Americans been lied to and indoctrinated to believe otherwise for generations since? Because the real reasons couldn’t be given.

The atomic bombing wasn’t an act to end a war but to start another one, this time against the Soviets. To honestly and openly declare a new war before the last war had even ended would not have gone over well with the American people. And once this action was taken it could never be revealed, not even when all those involved had long been dead. Propaganda narratives, once sustained long enough, take on a life of their own. The tide is slowly turning, though. As each generation passes, fewer and fewer remain who believe it was justified, from 85 percent in 1945 to 56 percent in 2015.

When the last generation raised on WWII propaganda dies, that percentage will finally drop below the 50 percent mark, and maybe we will then have an honest discussion about the devastating results of a moral failure that didn’t end with those atomic bombs but has been repeated in so many ways since. The crimes against humanity in the bombing of Japan were echoed in the travesties of the Vietnam War and the Iraq War. Millions upon millions have died over the decades from military actions by the Pentagon and covert operations by the CIA, combined with sanctions, which are considered declarations of war. Sanctions, by the way, were what incited the Japanese to attack the United States. In enforcing sanctions against a foreign government, the United States entered the war of its own volition, effectively declaring war against Japan, and then acted surprised when the Japanese defended themselves.

All combined, through direct and indirect means, that possibly adds up to a body count of hundreds of millions of innocents sacrificed so far since American imperial aspirations began. This easily matches the levels of atrocity seen in the most brutal regimes of the past (Investing in Violence and Death, Endless Outrage, Evil Empire, & State and Non-State Violence Compared). The costs are high. When will there be a moral accounting?

* * *

Hiroshima, Nagasaki, and the Spies Who Kept a Criminal US with a Nuclear Monopoly from Making More of Them
by Dave Lindorff

It was the start of the nuclear age. Both bombs dropped on Japan were war crimes of the first order, particularly because we now know that the Japanese government, which at that time was having all its major cities destroyed by incendiary bombs that turned their mostly wooden structures into towering firestorms, was, even before Aug. 6, desperately trying to surrender via entreaties through the Swiss government.

The Big Lie is that the bomb was dropped to save US troops from having to invade Japan. In fact, there was no need to invade. Japan was finished, surrounded, the Russians attacking finally from the north, its air force and navy destroyed, and its cities being systematically torched.

Actually, though, the US didn’t want Japan to surrender yet. Washington and President Harry Truman wanted to test their two new super weapons on real urban targets and, even more importantly, wanted to send a stark message to the Soviet Union, the supposed World War II ally which US war strategists and national security staff had actually viewed all through the conflict as America’s next existential enemy.

As authors Michio Kaku and Daniel Axelrod, two theoretical physicists, wrote in their frightening, disturbing and well researched book To Win a Nuclear War: The Pentagon’s Secret War Plans (South End Press, 1987), the US began treacherously planning to use its newly developed super weapon, the atom bomb, against the war-ravaged Soviet Union even before the war had ended in Europe. Indeed, a first plan to drop 20-30 Hiroshima-sized bombs on 20 Russian cities, code-named JIC 329/1, was intended to be launched in December 1945. Fortunately, that never happened, because at that point the US only had two atomic bombs in its “stockpile.”

They describe how, as the production of new bombs sped up (9 nuclear devices by June 1946, 35 by March 1948, and 150 by January 1949), new plans with such creepy names as Operations Pincher, Broiler, Bushwacker, Sizzle, and Dropshot were developed, and the number of Soviet cities to be vaporized grew from 20 to 200.

Professors Kaku and Axelrod write that Pentagon strategists were reluctant to go forward with these early planned attacks not because of any unwillingness to launch an unprovoked war, but out of fear that the destruction of Soviet targets would be inadequate to prevent the Soviets’ still-powerful and battle-tested Red Army from responding by over-running war-ravaged Europe—a counterattack the US would not have been able to prevent. These strategists recommended that no attack be made until the US military had at least 300 nukes at its disposal (remember, at this time there were no hydrogen bombs, and the size of a fission bomb was constrained by the small size of the core’s critical mass). It was felt, in fact, that the bombs were so limited in power that it could take two or three to decimate a city like Moscow or Leningrad.

So the plan for wiping out the Soviet Union was gradually deferred to January 1953, by which time it was estimated that there would be 400 larger Nagasaki bombs available, and that even if only 100 of these 25-50 kiloton weapons hit their targets it could “implement the concept of ‘killing a nation.’”

The reason this epic US holocaust never came to pass is now clear: to the astonishment of US planners and even many of the US nuclear scientists who had worked so hard in the Manhattan Project to invent and produce the atomic bomb (two types of atomic bomb, really), on August 29, 1949 the Soviets exploded their own bomb, the “First Lightning”: an almost exact replica of the “Fat Man” plutonium bomb that had destroyed Nagasaki four years earlier.

And the reason the Soviet scientists, brilliant as they were but financially strapped by the massive destruction the country had suffered during the war, had been able to create their bomb in roughly the same amount of time that the hugely funded Manhattan Project had done was primarily the information provided by a pair of scientists working at Los Alamos who offered detailed plans, secrets about how to work with the very tricky and unpredictable element Plutonium, and how to get a Plutonium core to explode in a colossal fireball instead of just producing a pathetic “fizzle.”

The Psychopath Epidemic
by Cameron Reilly

Another of my favorite examples of the power of brainwashing by the military-industrial complex is that of the bombings of Hiroshima and Nagasaki by the United States in 1945. Within the first two to four months of the attacks, the acute effects killed 90,000-166,000 people in Hiroshima and 60,000-80,000 in Nagasaki, with roughly half of the deaths in each city occurring on the first day. The vast majority of the casualties were civilians.

In the seventy-three years that have passed since Hiroshima, poll after poll has shown that most Americans think that the bombings were wholly justified. According to a survey in 2015, fifty-six percent of Americans agreed that the attacks were justified, significantly less than the 85 percent who agreed in 1945 but still high considering the facts don’t support the conclusion.

The reasons most Americans cite for the justification of the bombings is that they stopped the war with Japan; that Japan started the war with the attack on Pearl Harbor and deserved punishment; and that the attacks prevented Americans from having to invade Japan causing more deaths on both sides. These “facts” are so deeply ingrained in most American minds that they believe them to be fundamental truths. Unfortunately, they don’t stand up to history.

The truth is that the United States started the war with Japan when it froze Japanese assets in the United States and embargoed the sale of oil the country needed. Economic sanctions then, as now, are considered acts of war.

As for using the bombings to end the war, the U.S. was well aware in the middle of 1945 that the Japanese were prepared to surrender and expected it would happen when the USSR entered the war against them in August 1945, as pre-arranged between Truman and Stalin. The primary sticking point for the Japanese was the status of Emperor Hirohito. He was considered a god by his people, and it was impossible for them to hand him over for execution by their enemies. It would be like American Christians handing over Jesus, or Italian Catholics handing over the pope. The Allies refused to clarify what Hirohito’s status would be post-surrender. In the end, they left him in place as emperor anyway.

One American who didn’t think using the atom bomb was necessary was Dwight Eisenhower, future president and, at the time, the supreme allied commander in Europe. He believed:

Japan was already defeated and that dropping the bomb was completely unnecessary, and… the use of a weapon whose employment was, I thought, no longer mandatory as a measure to save American lives. It was my belief that Japan was, at that very moment, seeking some way to surrender with a minimum loss of “face.”…

Admiral William Leahy, chief of staff to Presidents Franklin Roosevelt and Harry Truman, agreed.

It is my opinion that the use of this barbarous weapon at Hiroshima and Nagasaki was of no material assistance in our war against Japan. The Japanese were already defeated and ready to surrender because of the effective sea blockade and the successful bombing with conventional weapons. My own feeling was that in being the first to use it, we had adopted an ethical standard common to the barbarians of the Dark Ages. I was not taught to make war in that fashion, and wars cannot be won by destroying women and children.

Norman Cousins was a consultant to General MacArthur during the American occupation of Japan. Cousins wrote that

MacArthur… saw no military justification for dropping of the bomb. The war might have ended weeks earlier, he said, if the United States had agreed, as it later did anyway, to the retention of the institution of the emperor.

If General Dwight Eisenhower, General Douglas MacArthur, and Admiral William Leahy all believed dropping atom bombs on Japan was unnecessary, why do so many American civilians still today think it was?

Probably because they have been told to think that, repeatedly, in a carefully orchestrated propaganda campaign, enforced by the military-industrial complex (that Eisenhower tried to warn us about), that has run continuously since 1945.

As recently as 1995, the fiftieth anniversary of the bombings of Hiroshima and Nagasaki, the Smithsonian Institution was forced to censor its retrospective on the attacks under fierce pressure from Congress and the media because it contained “text that would have raised questions about the morality of the decision to drop the bomb.”

On August 15, 1945, about a week after the bombing of Nagasaki, Truman tasked the U.S. Strategic Bombing Survey to conduct a study on the effectiveness of the aerial attacks on Japan, both conventional and atomic. Did they affect the Japanese surrender?

The survey team included hundreds of American officers, civilians, and enlisted men, based in Japan. They interviewed 700 Japanese military, government, and industry officials and had access to hundreds of Japanese wartime documents.

Less than a year later, they published their conclusion—that Japan would likely have surrendered in 1945 without the Soviet declaration of war and without an American invasion: “It cannot be said that the atomic bomb convinced the leaders who effected the peace of the necessity of surrender. The decision to surrender, influenced in part by knowledge of the low state of popular morale, had been taken at least as early as 26 June at a meeting of the Supreme War Guidance Council in the presence of the Emperor.”

June 26 was six weeks before the first bomb was dropped on Hiroshima. The emperor wanted to surrender and had been trying to open up discussions with the Soviets, the only country with whom they still had diplomatic relations.

According to many scholars, the final straw would have come on August 15, when the Soviet Union, as agreed months previously with the Truman administration, was planning to declare it was entering the war against Japan.

But instead of waiting, Truman dropped the first atomic bomb on Japan on August 6.

The proposed American invasion of the home islands wasn’t scheduled until November.

Mid-20th Century American Peasant Communities

Industrial capitalism is radically new, and late-stage capitalism came later still. That is certainly true for the United States. Until about a century ago, most Americans lived in rural communities, did manual labor on small family farms, subsisted by growing most of their own food, and bought what little else they needed through store tabs and barter. Many Americans were still living this way within living memory, and a few such communities persist in places across the United States.

Segmented Worlds and Self
by Yi-Fu Tuan
pp. 17-21

Most peasants are villagers, members of cohesive communities. What is the nature of this cohesion and how is it maintained? A large and varied literature on peasants exists—historical studies of villages in medieval Europe and ethnographic surveys of peasant economy and livelihood in the poorer parts of the world at the turn of the century. Though differing from each other in significant details, peasant worlds nonetheless share certain broad traits that distinguish them from urban and modern societies. First, peasants establish intimate bonds with the land: that one must labor hard to survive is an accepted truth that is transformed into a propitiatory and pious sentiment toward Mother Earth. Deities of the soil and ancestral spirits become fused. Peasants see themselves as belonging to the land, “children of the earth,” a link between past and future, ancestors and progeny. Biological realities and metaphors, so common in the peasant’s world, tend to suppress the idea of the self as a unique end or as a person capable of breaking loose from the repetitive and cyclical processes of nature to initiate something radically new. Although peasants may own the land they work on, they work more often in teams than individually. Many agricultural activities require cooperation; for example, when the fields need to be irrigated and drained, or when a heavy and expensive piece of equipment (such as the mill, winepress, or oven belonging to the landlord) is to be used. Scope for individual initiative is limited except in small garden plots next to the house, and even there customary practices prevail. Individualism and individual success are suspect in peasant communities. Prosperity is so rare that it immediately suggests witchcraft.

In the peasant’s world the fundamental socioeconomic unit is the extended family, members of which—all except the youngest children—are engaged in some type of productive work. They may not, however, see much of each other during the day. Dinnertime may provide the only opportunity for family togetherness, when the webs of affection and lines of authority become evident to all. More distant relatives are drawn into the family net on special occasions, such as weddings and funerals. Besides kinsfolk, villagers can count on the assistance of neighbors when minor needs arise, whether for extra hands during harvest, for tools, or even for money. In southeast China, a neighborhood is clearly defined as five residences to each side of one’s own. Belonging to a neighborhood gives one a sense of security that kinsfolk alone cannot provide. Villagers are able to maintain good neighborly relations with each other because they have the time to socialize. In Europe the men may go to a tavern, where after a few beers they feel relaxed enough to sing together—that most comradely of human activities. In China the men, and sometimes the women as well, may go to a teahouse in a market town, where they can exchange gossip among themselves and with visitors from other villages. More informally, neighbors meet to chat and relax in the village square in the cool of the evening. Peasants desire contentment rather than success, and contentment means essentially the absence of want. When a man achieves a certain level of comfort he is satisfied. He feels no compulsion to use his resource and energy for higher economic rewards. He has the time and sense of leisure to hobnob with his fellows and bathe in their undemanding good will. Besides these casual associations, peasants come together for planned festivals that might involve the entire village. The New Year and the period after harvest are such occasions in many parts of the world. The number of festivals and the days on which they occur vary from place to place, but without exception festivals come to pass when people are relatively free, that is, during the lax phases of the calendar year.

Festivals, of course, strengthen the idea of group self. These are the times when the people as a whole express their joy in the success of a harvest, or the growing strength of the sun. Simultaneously, they reaffirm their piety toward the protective deities of the earth and sky, their sense of oneness with nature. Group cohesiveness is a product of need, a fact that is manifest in the traditional world of villagers at different scales, ranging from that of family and kinsfolk, through those of neighbors and work team, to the entire community as it celebrates the end of a period of toil or the passing of a crisis of nature, or as it is girded in self-defense against natural calamity or human predators. Necessity is not a condition that human beings can contemplate for long without transforming it into an ideal. Thus, the cooperation necessary to survival becomes a good in itself, a desirable way of life. Units of mutual help achieve strong identities that can persist long after the urgencies that called them into existence have passed. In such groups, forged initially out of need but sustained thereafter by a sense of collective superiority, wayward and questioning individuals have no place.

A common image of America is that it is a land of individualists. Even in the colonial period, when towns were small and isolated, intimately knit communal groups like those of Europe did not exist. The people who lived in them, particularly in the Middle Colonies, shared too few common traditions and habits. Moreover, they were continually moving in and out. In New England, where settlers made periodic attempts to establish communities artificially by means of consciously constructed models, the results were mixed in relation to satisfaction and permanence. In the countryside, the Jeffersonian ideal of the yeoman farmer seems to have held sway. Nevertheless, not only individualists but families and clusters of families migrated to the frontier, and in the course of time some of them became deeply rooted agglutinate communities, in which such characteristic American ideals as upward social mobility, individual initiative, and success were alien.

Traditional farming communities, relics from the past, persist in rural America at mid-twentieth century. Consider the sixty-odd families whose roots in the hollows of Tennessee, a few miles south of Nashville, go back to 1756. Over the course of two hundred years, intermarriage has produced the closest bonds. Natural warmth between kinsfolk and neighbors is reinforced by a deep suspicion of outsiders. The community is strongly egalitarian. Work roles differ by age and sex, but social stratification as it exists in most parts of the country is unknown. “In work terms,” writes John Mogey, “no one is clearly leader: collective responsibility for work assignment is the rule to an extent that to speak of individual or family farming enterprises would be to violate the facts.” In her study of this community, Elmora Matthews notes how warm feelings between farmers can emerge from a combination of blood ties, laboring at common tasks, and informal socializing. One woman described the relation between her four brothers, who have adjoining farms: “They work all day long together, eat their meals together, and then always sit around and visit with each other before they go home.” Ambition, and even efficiency when it is obtrusive, are bad. On the other hand, “no one ever condemns a husband who evades his work. If anything, a man who sits around home a lot blesses a family group.” One of the most respectable activities for a man is to loaf and loiter with other men. The greatest satisfaction lies in the warm exchange of feeling among relatives and close friends at home, church, or store.

People in this Tennessee community almost never organize formally for special ends. There are no communal projects. The community is not a provisional state that might be altered and improved upon, or used for some larger, ulterior purpose. It is the supreme value and sole reality: whatever threatens to disrupt it is bad. Critical self-awareness seems minimal. Thus, although these Tennessee people fervently believe in freedom, anyone who exercises it to develop his talent and becomes a success is harshly judged. Thorough conformists in thinking and behavior, they nevertheless resent the government for its tendency to impose rules and regulations, and they regard communism as unimaginably horrible.

Close-knit communities of this kind can be found in the more isolated countrysides of Western Europe and North America even in the middle of the twentieth century.

Rainbow Pie: A Redneck Memoir
by Joe Bageant
pp. 15-20

When Virginia Iris Gano and Harry Preston Bageant crested that ridge in their buggy and began their life together, they stood an excellent chance of making it. For starters, in that world the maths of life was easier, even if the work was harder. If you could show the bank or the seller of the land that you were healthy and sober, and knew how to farm, you pretty much had the loan (at least when it came to the non-arid eastern American uplands; the American West was a different matter). At 5 percent simple interest, Pap bought a 108-acre farm — house, barn, and all — for $400. (It was a cash-poor county, and still is. As recently as 1950 you could buy a 200-acre farm there for about $1,000.) On those terms, a subsistence farmer could pay off the farm in twenty years, even one with such poor soils as in these Southern uplands. But a subsistence farmer did not farm to sell crops, though he did that, too, when possible. Instead, he balanced an entire life with land and human productivity, family needs, money needs, along with his own and his family’s skills in a labor economy, not a wealth economy. The idea was to require as little cash as possible, because there wasn’t any to be had.

Nor was much needed. The farm was not a business. It was a farm. Pap and millions of farmers like him were never in the “agribusiness”. They never participated in the modern “economy of scale” which comes down to exhausting as many resources as possible to make as much money as possible in the shortest time possible. If you’d talked to him about “producing commodities under contract to strict specifications”, he wouldn’t have recognized that as farming. “Goddamned jibber-jabber” is what he would have called it. And if a realtor had pressed him about the “speculative value” of his farmland as “agronomic leverage”, I suspect the old 12-gauge shotgun might have come down off the rack. Land value was based upon what it could produce, plain and simple. These farms were not large, credit-based “operations” requiring annual loans for machinery, chemicals, and seed.

Sure, farmers along Shanghai Road and the Unger Store community bought things at the junction store on credit, to be paid for in the autumn. Not much, though. The store’s present owners, descendants of the store’s founders, say that an annual bill at the store would run to about ten dollars. One of them, Richard Merica, told me, “People bought things like salt and pepper. Only what they couldn’t make for themselves, like shotgun shells or files.” Once I commented to an old Unger Store native still living there that, “I suspect there wasn’t more than $1,000 in the Unger Store community in the pre-war days.”

“You’re guessing way too high,” he said. “Try maybe $400 or $500. But most of it stayed here, and went round and round.”

So if Pap and the other subsistence farmers there spent eight bucks a year at the local crossroads store, it was eight bucks in a reciprocal exchange that made both their subsistence farming and the Unger Store possible as a business and as a community.

Moneyless as it was, Maw and Pap’s lives were far more stable than one might think today. In fact, the lives of most small farmers outside the nasty cotton sharecropping system of deep-southern America were stable. Dramatic as the roller-coaster economics of the cities and the ups and downs caused by crop commodity speculators in Chicago were, American farm life remained straightforward for the majority. Most were not big Midwestern broad-acre farmers who could be destroyed by a two-cent change in the price of wheat. Wheat in Maw and Pap’s time hovered at around fifty to fifty-five cents a bushel; corn, at forty-five; and oats at about fifty-six. Multiply the acreage by average bushels per acre for your piece of land, and you had a start at figuring out a realistic basis for your family’s future. It was realistic enough that, after making allowances for bad years, plus an assessment of the man seeking the loan, the banks lent Pap the price of a farm. That assessment was not shallow.

Pap was expected to bring to the equation several dozen already-honed skills, such as the repair, sharpening, and use of tools (if you think that is simple, try laying down wheat with a scythe sometime); the ability to husband several types of animal stock; and experience and instinct about soils and terrain, likely weather, and broadcasting seed by hand. Eastern mountain subsistence farms needed little or no planting equipment because plots were too small and steep. What harvesting equipment such as reapers and threshers might be needed was usually owned by one man who made part of his living reaping and threshing for the rest of the community. Other skills included planting in cultivated ridges, managing a woodlot, and estimating hours of available sunlight for both plant growth and working. The subsistence farm wife’s life required as much experience and skill on a different front of family provision.

That said, Pap wasn’t a particularly good farmer. He wasn’t a bad farmer, either. He was just an average farmer among millions of average farmers. The year my grandparents married, about 35 million Americans were successfully engaged in farming, mostly at a subsistence level. It’s doubtful that they were all especially gifted, or dedicated or resourceful. Nevertheless, their kind of human-scale family farming proved successful for twelve generations because it was something more — a collective consciousness rooted in the land that pervaded four-fifths of North American history.

They farmed with the aid of some 14 million draft horses and God only knows how many mules. Pap wasn’t much for mules; all the farming he had to do could easily be done with one horse. Without going into a treatise on horse farming, let me say that, around 1955 at the age of ten, I saw the last of Pap’s work horses in use, a coal-black draft animal named “Nig” (short for nigger, of course). By then, Nig, who was Nig number three, if I remember correctly, was over twenty years old, and put out to pasture — a loose use of the term, given that he spent his time in the shade of the backyard grape arbor waiting to be hand-fed treats. But Nig still pulled a single tree-plow in a four-acre truck garden down in the bottom land — mostly melons, tomatoes, and sweet corn — while I sometimes rode atop barefoot holding onto the wooden hames at the collar. Pap walked behind, guiding the plow. “Gee Nig! Haw Nig! Step right … Turn and baaack. Cluck-cluck.” The rabbit dogs, Nellie and Buck, trotted alongside in the spring sun.

Though Pap owned a tractor by then — a beaten-up old Farmall with huge, cleated steel wheels, a man-killer prone to flipping over backward and grinding the driver bloodily under the cleats — he could still do all his cultivation walking behind Nig in the spring. In summer he’d scratch out the weeds with a horseless garden plow, or “push plow”, and pick off bugs by hand, dropping them into a Maxwell House coffee can half-filled with kerosene. Pap hand-harvested most things, even large cornfields, using a corn cutter fashioned from an old Confederate sword. But it is that old horse and that old man with the long leather lines thrown up over his shoulders, the plow in his iron grip, and cutting such straight lines in the red clay and shale, that I remember most fondly. He made it look easy. Fifty years in the furrows will do that.

pp. 41-53

THE CULTURAL VALUES MAY REMAIN, HANGING over everything political and many things that are not, but there are few if any remaining practitioners of the traditional family or community culture by which Pap and Maw lived — the one with the woman in the home, and the man in the fields, although Maw certainly worked in the fields when push came to shove. This is not to advocate such as the natural order of things. I am neither Amish nor Taliban. But knee-jerk, middle-class, mostly urban feminists might do well to question how it all started and what the result has been — maybe by getting out and seeing how few of their sisters gutting chickens on the Tyson’s production line or telemarketing credit cards on the electronic plantation relish those dehumanizing jobs that they can never quit.

It would do them well to wonder why postwar economists and social planners, from their perches high in the executive and management class, deemed it best for the nation that more mothers become permanent fixtures of America’s work force. This transformation doubled the available labor supply, increased consumer spending, and kept wages lower than they would have otherwise been. National production and increased household income supposedly raised everyone’s quality of life to stratospheric heights, if Formica countertops and “happy motoring” can be called that. I’m sure it did so for the managing and owning classes, and urban people with good union jobs. In fact, it was the pre-war trade unions at full strength, particularly the United Auto Workers, that created the true American middle class, in terms of increased affluence for working people and affordable higher education for their children.

What Maw and Pap and millions of others got out of it, primarily, were a few durable goods, a washing machine, a television, and an indoor toilet where the pantry, with its cured meats, 100-pound sacks of brown sugar, flour, and cases of eggs had been. Non-durable commodities were vastly appreciated, too. One was toilet paper, which ended generations of deep-seated application of the pages of the Sears Roebuck mail-order catalog to the anus (the unspoken limit seemed to be one page to a person at a sitting). The other was canned milk, which had been around a long time, but had been unaffordable. Milk cows are a wonderful thing, but not so good when two wars and town work have drained off your family labor-supply of milkers. […]

The urging of women into the workplace, first propagandized by a war-making state, was much romanticized in the iconic poster image of Rosie the Riveter, with her blue-denim sleeves rolled up and a scarf tied over her hair. You see the image on the refrigerator magnets of fuzzy-minded feminists-lite everywhere. This liberal identity-statement is sold by the millions at Wal-Mart, and given away as a promotional premium by National Public Radio and television.

Being allowed to manufacture the planes that bombed so many terrified European families is now rewritten as a feminist milestone by women who were not born at the time. But I’ve never once heard working-class women of that period rave about how wonderful it was to work long days welding bomb-bay doors onto B-29s.

The machinery of state saw things differently, and so the new reality of women building war machinery was dubbed a social advance for American womankind, both married and single. In Russia, it was ballyhooed as Soviet socialist-worker equality. And one might even believe that equality was the prime motive, when viewed sixty years later by, for instance, a university-educated specimen of the gender writing her doctoral dissertation. But for the children and grandchildren of Rosie the Riveter, those women not writing a dissertation or thesis, there is less enthusiasm. Especially among working mothers. The Pew Research Center reports that only 13 percent of working mothers think that working benefits their children. But nearly 100 percent feel they have no choice. Half of working mothers think their employment is pointless for society. Forty-two percent of Americans, half of them women, say that working mothers have been bad for society on the whole. Nearly all working mothers say they feel guilty as they rush off to work.

Corporations couldn’t have been happier with the situation. Family labor was siphoned off into the industrial labor pool, creating a surplus of workers, which in turn created a cheaper work force. There were still the teeming second-generation immigrant populations available for labor, but there were misgivings about them — those second-generation Russian Jews, Italians, Irish, Polish, and Hungarians, and their like. From the very beginning, they were prone to commie notions such as trade unions and eight-hour workdays. They had a nasty history of tenacity, too.

On the other hand, out there in the country was an endless supply of placid mules, who said, “Yes, Ma’m” and “No, Ma’m”, and accepted whatever you paid them. Best of all, except for churches and the most intimate community groups, these family- and clan-oriented hillbillies were not joiners, especially at some outsiders’ urging. Thus, given the nature of union organizing — urging and convincing folks to join up — local anti-union businessmen and large companies alike had little to fear when it came to pulling in workers from the farms.

Ever since the Depression, some of the placid country mules had been drifting toward the nearest cities anyway. By the 1950s, the flow was again rapidly increasing. Generation after generation couldn’t keep piling up on subsistence farms, lest America come to be one vast Mennonite community, which it wasn’t about to become, attractive as that idea might seem now. Even given America’s historical agrarian resistance to “wage slavery” (and farmers were still calling it that when I was a kid), the promise of a regular paycheck seemed the only choice. We now needed far more money to survive, because we could no longer independently provide for ourselves.

Two back-to-back wars had effectively drained off available manpower to the point where our family farm offered only a fraction of its former sustenance. Even if we tried to raise our own food and make our own clothing out of the patterned multi-colored feed sacks as we had always done, it took more money than ever. […]

By the mid and late 1950s, the escalating monetized economy had rural folks on the ropes. No matter how frugal one was, there was no fighting it. In a county where cash had been scarce from the beginning — though not to disastrous effect — we children would overhear much talk about how this or that aunt or uncle “needs money real bad”. […]

WHEN IT COMES TO MONEY, I AM TOLD THAT BEFORE the war some Unger Store subsistence farmers got by on less than one hundred dollars a year. I cannot imagine that my grandfather ever brought in more than one thousand dollars in any year. Even before the postwar era’s forced commodification of every aspect of American life, at least some money was needed. So some in my family, like many of their neighbors, picked apples seasonally or worked as “hired-on help” for a few weeks in late summer at the many small family-owned apple- and tomato-canning sheds that dotted Morgan County. In the 1930s, 1940s, and 1950s, between farming and sporadic work at the local flour, corn, and feed-grinding outfits, and especially the small canning operations, a family could make it. Pap could grow a few acres of tomatoes for the canneries, and Maw or their kids could work a couple of weeks in them for cash.

This was local and human-scale industry and farming, with the tomatoes being grown on local plots ranging from five to ten acres. Canners depended on nearby farm families for crops and labor, and the farm families depended upon them in turn for cash or its equivalent. […]

Farm-transport vehicles were much scarcer then, especially anything bigger than a quarter-ton pickup truck. So the sight of Jackson Luttrell’s one-ton Chevy truck with its high wooden sideboards was exciting in itself. In those days, farmers did not buy new $45,000 trucks to impress other farmers, or run to the nearest farm supply in one of them to pick up a couple of connector bolts. Every farmer had a farm wagon, whether pulled by horse or tractor, but almost nobody owned a truck. Common sense and thrift prevented them from spending big money on something that would only be used during one month each year at harvest time. Beyond that, farmers would not even think of growing those small acreages of tomatoes that the canneries depended upon if they had to buy a truck to transport them there — any profit made on the tomatoes would be lost on the truck. So, for folks such as Jackson Luttrell, who had one, ownership made more economic sense. He profited through its maximized use in getting everyone else’s crops to the mill or processing plant. One truck served the farm community, at minimum expenditure to the entire group. They didn’t even have to pay Jackson Luttrell any cash for the hauling.

That was because Cotton Unger, who owned the canning operation, was expected to get the tomatoes to his factory himself. As a businessman and entrepreneur, it was Unger’s job to deal with the problems that came with his enterprise. Unger’s job was to run a business; a farmer’s job was to farm. These were two separate things in the days before the rigged game of agri-business put all the cost on the farmers through loading them with debt, and all the profits went to business corporations. Nor did Unger’s duties as a capitalist end with getting the hauling done at his own expense. It was also his job to turn the local crops such as wheat, corn, and tomatoes into money, through milling or canning them for sale to bulk contractors elsewhere.

Cotton owned more than just the family store, which he’d inherited from his father, Peery Unger, and for which the community was named sometime after the Civil War. The store at the junction had gasoline pumps, a grinding mill, and a feed and seed farm-supply adjunct. It was also the official post office for that end of the county; and, just to be safe, Cotton Unger also farmed. The Unger family’s store was a modest, localized example of a vertically integrated, agriculturally based business, mostly out of necessity.

Cotton never saw much cash, and never got rich by any means. Not on the ten-cent and fifteen-cent purchases that farmers made there for over one hundred years. Yet he could pay Jackson Luttrell for the tomato hauling — in credit at the store. That enabled Jackson to buy seed, feed, hardware, fertilizer, tools, and gasoline, and farm until harvest time with very little cash, leaving him with enough to invest in a truck. Unger could run his tomato cannery and transform local produce into cash, because he could barter credit for farm products and services. This was a community economic ecology that blended labor, money, and goods to sustain a modest but satisfactory life for all.

At the same time, like most American businessmen then and today, Cotton Unger was a Republican. He was a man of the Grand Old Party: the party of a liberator named Abraham, who freed millions of black men from the bondage of slavery; and the party of two presidents named George, the second of whom subsequently ushered Americans of all colors back into slavery through national indebtedness. Being of a Republican stripe made Cotton Unger a rare bird in the strongly Democratic Morgan County.

Today he would be even rarer, because he was a Republican with the common wisdom to understand something that no Republican has ever grasped since: he realized that any wealth he might acquire in life was due not only to his own efforts, but also to the efforts of all other men combined — men who built the roads that hauled his merchandise; men who laid rail track, grew crops, drilled wells, and undertook all the other earthly labors that make society possible. Whether they were Democrats or not, he needed the other citizens around him as friends, neighbors, and builders of the community. To that end, he provided transportation to the polls at election time for farmers without cars — and they were many, Pap and Maw among them — full knowing that nearly every last one of them was going to vote against his candidate. In his ancestors’ time they had voted for Andrew Jackson, Martin Van Buren, James Polk, James Buchanan, Woodrow Wilson, Franklin Roosevelt, and Harry Truman — all Democrats.

The old-timers say that Cotton always looked kinda weary around election time. And well he must have been. On election day, Cotton chauffeured around Democratic voters, people who would vote against his interests, vote in favor of higher business taxes or to increase teachers’ pay to the point where the school-marm could almost make a living. But Cotton also understood that his personal interests resided more with his community and neighbors than with his political affiliation. Republican politicians in faraway Charleston took the back seat to his face-to-face daily life with his neighbors. Cotton, like his father Peery, and his grandfather, C.J. Unger, before him, knew that when you depend directly on neighbors for your daily bread, you’d damned-well better have their respect and goodwill. And you’d best maintain it over generations, too, if you plan to pass the family store down to your sons and your sons’ sons. We may never see that level of operative community democracy again.

pp. 61-69

Not that money was unimportant. Money has been important since the first Sumerian decided it was easier to carry a pocket full of barley shekels than hump a four-foot urn of barley down to the marketplace on his back. And it was certainly important 5,000 years later to the West Virginia hill country’s subsistence farmers. But in the big picture, money was secondary to co-operation and the willingness to work hard. A considered ecology of family labor, frugality, and their interrelationship with community was the economy. And the economy was synonymous with their way of life, even though that would have been a pretentious term to Pap and his contemporaries. He always said, “You just do the next thing that needs doing. You keep doing that, and everything gets done that needs to be done.” When I’d ask him what to do next, he’d say, “Just look to see what needs doing, dammit!”

Understanding what needed doing was the glue of subsistence farming’s family-work ecology, which was also ecological in the environmental sense. Knowledge was passed along about which fields best grew what produce, the best practices to maintain fertility, and what the farm could sustainably produce year in and year out. It was a family act.

Those farm families strung out along Shanghai Road could never have imagined our existential problems or the environmental damage we now face. But, after having suffered such things as erosion from their own damaging early-American practices, they came to understand that nature and man do not stand separately. The mindfulness involved in human-scale farming demands such. To paraphrase Wendell Berry, we should understand our environmental problem as a kind of damage that has also been done to humans. In all likelihood, there is no solution for environmental destruction that does not first require a healing of the damage done to the human community. And most of that damage to the human world has been done through work, our jobs, and the world of money. Acknowledging such things about our destructive system requires honesty about what is all around us, and an intellectual conscience. And asking ourselves, “Who are we as a people?”

Meanwhile, as settlers migrated down the Great Valley of Virginia, as they called the Shenandoah Valley toward the fertile southlands, the poorer among them kept seeping westward into the uncleared Blue Ridge, where land was cheapest and work was hardest. When they settled on Fairfax’s land, they may have become human assets to his holdings. But they were not slaves and they were not employees. The overwhelming portion of the fruits of their labor were directly their own. They could not be fired. They could not incur oppressive financial debt. And if their farms were isolated specks in the blue Appalachian fog with their split-pine log floors, they were nevertheless specks located in a great, shared commons called nature.

In contrast to Fairfax and the planter society’s money-based economy of wealth, these settlers lived by a family-based economy of labor. Not that they had a choice. Any kind of coinage or currency was rare throughout the colonies. Their economy depended on the bartering of labor and sometimes goods between themselves. Dr Warren Hofstra, an eminent historian of the area, tells me this system was so complex that they kept sharply detailed ledger books of goods and services bartered, even of small favors done for one another. In essence, this was an economy whose currency was the human calorie. Be it a basket of apples or a week’s labor hauling stone for a house, everything produced (which was everything in their subsistence world, there being no money), was accomplished by an expenditure of human energy. Calories burned could only be replaced by an expenditure of calories to plant, grow, and preserve future calories for sustained sustenance. This was a chain of caloric expenditures or barter going all the way back to the forging of the iron hoe or plow that made subsistence possible at all. Keenly aware that both time and their own human energy were finite, they measured, balanced, and assigned value to nearly every effort, large or small. Wasting these resources could spell hunger or failure to subsist.

This attitude lives on today among the descendants of the settlers. When outsiders move into this area, they often comment on what they perceive as the miserliness of the natives. Or the fact that they will not let you do them even a small favor, lest they be obligated in return.

A lady new to the area, a physician who hails from Delaware, told me: “I went shopping with Anna at the mall last week. We went in my car. She tried to give me three dollars for ‘gas money’. I told her that was very kind, but we’d only driven two miles at best and that it wasn’t necessary. She kept pushing the money at me, saying ‘Here, take this,’ getting more and more insistent each time. I kept declining until I noticed that she was becoming honestly and truly angry with me. It was so damned strange, I’ve never seen anything like it. So I took the three dollars.”

I explained that many natives are like that, and told her about the early settlers’ rigid barter-and-favor economy, and how these attitudes have unconsciously come down through our cultural history, remaining as deeply instilled social practices and conventions. It can work the other way around, too. Some people will unexpectedly do something very nice for you, or give you something — maybe an antique or whatever.

“Don’t let the Southern charm fool you, though,” I said. “In the back of their mind they have marked it down as a favor or a social debt owed. And they’ll expect you to recognize when to pay it back. Maybe volunteer to feed their dog or water their lawn when they are away. At the same time, you should feel somewhat honored. It’s a down payment on developing further friendship. If they hadn’t judged you to be a worthy, reliable, and reciprocating person, dependable in a friendship, they wouldn’t even bother to know you at all. In fact, that’s why so many outsiders perceive some natives as snotty and cold.”

“Amazing,” she said. “I’d never guess their behavior had such deep cultural roots.”

“Neither would they,” I replied.

As the hill-country population grew, their isolation lessened. Farmers grew more connected in a community network of seasonal mutual efforts, such as threshing, hunting, hog slaughtering, haymaking, clannish marriages, and birth, burial, and worship. These conventions were still being observed into the 1950s as I was growing up there.

Family and community life in that early, non-wealth-based economy is impossible for us to comprehend. No man can fully grasp a life he has not lived, or for that matter completely grasp the one he is living. But we Blue Ridge folk most surely live subject to the continuing effects of that dead culture which is never really dead.

For example, the old agrarian culture of reserve, frugality, and thought-out productivity translates into political conservatism today, even though few of its practitioners could identify a baling hook if their lives depended on it. At its core stood — and still stand, for the most part — “family values”, which meant (duh!) valuing family. Valuing family above all else, except perhaps God’s word. Grasping the true meaning of this is to understand much of the conservative American character, both its good and its bad qualities. I dare say it also holds some solutions to the dissolution of human community, the destabilizing of world resources, and the loss of the great commons, human and natural, all sacrificed to the monstrous fetish of commodities, their acquisition and their production through an insane scale of work and round-the-clock commerce and busyness.

Rate of Moral Panic

I’m always looking for historical background that puts our present situation in new light. We often don’t realize, for example, how different the world was before and after the Second World War. The 1940s and 1950s were a strange time.

There was a brief moment around the mid-century when the number of marriages shot up and people married younger. So, when we compare marriage rates now to those in the post-war period, we get a skewed perspective because that post-war period was extremely abnormal by historical standards (Ana Swanson, 144 years of marriage and divorce in the United States, in one chart). It’s true that marriage rates never returned to the level of that brief marriage (and birth) boom following the war, but then again marriage rates weren’t ever that high earlier either.

In the 1990s, during the height of the culture wars when family values were supposedly under attack, the marriage rate was about the same as it was from before the Civil War and into the early 1900s, the period I’ve referred to as the crisis of identity. In the decades immediately before that, starting around 1970, the marriage rate had been even higher than what was seen in the late 19th century (there isn’t dependable data from earlier). Nor is it that premarital sex has become normalized over time, as young people have always had sex: “leaving out the even lower teen sex rate of GenZ, there isn’t a massive difference between the teen sex rates of Millennials and that of Boomers and Silents” (Rates of Young Sluts).

As another example from this past century, “In 1920, 43 percent of Americans were members of a church; by 1960, that figure had jumped to 63 percent” (Alex Morris, False Idol — Why the Christian Right Worships Donald Trump). Think about that. Most Americans, in the early 1900s, were some combination of unchurched and non-religious or otherwise religiously uninvolved and disinterested. A similar pattern was seen in the colonial era when many people lived in communities that lacked a church. Church membership didn’t begin to rise until the 1800s and apparently declined again with mass urbanization and early industrialization.

By the way, that is closely associated with the issue of marriage. Consider early America, when premarital sex was so common that a large percentage of women got married after pregnancy and many of those marriages were common law, meaning that couples were simply living together. Moral norms were an informal affair that, if and when enforced, came from neighbors and not religious authority figures. Those moral norms were generous enough to allow the commonality of bastards and single parents, although some of that was explained by other issues such as rape and spousal death.

Many early Americans rarely saw a minister, outside of itinerant preachers who occasionally passed by. This is partly why formal marriages were less common. “Historians of American religion have long noted that the colonies did not exude universal piety. There was a general agreement that in the colonial period no more than 10-20 percent of the population actually belonged to a church” (Roger Finke & Rodney Stark, The Churching of America). This was at a time when many governments had state religions and so churches were associated with oppressiveness, as seen with the rise of non-Christian views (agnosticism, atheism, deism, universalism, unitarianism, etc) during the revolutionary period.

And don’t get me started on abortion: perhaps as many as one in five or six pregnancies were aborted right before the American Civil War. That might be related to why fertility rates have been steadily dropping for centuries: “Extending the analysis back further, the White fertility rate declined from 7.04 in 1800 to 5.42 in 1850, to 3.56 in 1900, and 2.98 in 1950. Thus, the White fertility declined for nearly all of American history but may have bottomed out in the 1980s. Black fertility has also been declining for well over 150 years, but it may very well continue to do so in the coming decades” (Ideas and Data, Sex, Marriage, and Children: Trends Among Millennial Women).

Are we to blame commie liberal hippies traveling back in time to cause the decline of America practically before the country was even founded? Nostalgia is a fantasy and, interestingly, it is also a disease. The world is getting worse in some ways, but the main problems we face are real-world crises such as climate change, not namby-pamby cultural paranoia and fear-mongering. The fate of humanity does not rest on promoting the birth rate of native-born American WASPs nor on the hope that theocracy will save us. If we want to worry about doom, we should be looking at whether the rate of moral panic is experiencing an uptick, something that often precedes the rise of authoritarian mass violence.

Past Views On One Meal A Day (OMAD)

“Eating once a day is angelic, twice a day human, and three, four or more times is bestial.”
~Le Menagier de Paris, The Parisian Household Book, 1393

“Oru velai sapta yogi (if you eat once you’re a yogi);
Rendu velai sapta bogi (if you eat twice you’re a hedonist);
Moonu velai sapta rogi (if you eat thrice you’re a patient).”
~Traditional Tamil wisdom

“And there are men to be found who take but one meal a day, and yet remain quite healthy. The elder Fowler, the phrenologist, is one of them. Such, too, in past years, were Talleyrand of France, and Mr. Taliaferro of Virginia. It is even stated that some of the old Romans ate but one meal a day. Seneca, though worth an estate of $15,000,000, taught the doctrine, and, as it is said, practised it.”
~William Andrus Alcott, 1859

The Laws of Health:
Or, Sequel to The House I Live In

by William Andrus Alcott


636. The question, how often we should eat, has been much agitated, especially within a few years; and with various results. In general, however, there is a belief that we eat too often, and that a deduction from the number of our meals might very profitably be made. Many incline to the opinion that two meals a day for healthy adults are quite sufficient. A few go farther still, and teach that nature’s purposes are best answered by only one.

637. This subject, like most others pertaining to a connection with the appetite, has been hitherto approached in a wrong way. For, since nature, perverted as she is, ever tends to excess, the great practical question in all these matters should be, not how much we may gratify ourselves without any evil results, but how little gratification will best accord with our usefulness. Instead of inquiring how near the edge of a precipice we can go without falling from it, we should seek to keep at the greatest practicable distance. The proper question is not, Which is the worst or most dangerous road? but, Which is the best?

638. In the present instance, the true physiological inquiry should be, What is the least number of daily meals which will best answer nature’s purposes? What number will preserve us in the most healthy condition, and at the same time give us the firmest appetite, and, in the aggregate, the most pleasure? The true question is not, How often can we eat and not get sick immediately? And yet, more than this, I say, is very seldom asked.

639. Although it should be our first and highest aim to do what is best and most according to truth in all things which concern our appetites, yet we can never keep pleasure entirely out of sight; nor is it the Divine intention that we should. God has kindly united duty, interest, and pleasure; and what he has joined together should not be sundered.

640. There can be little doubt that, the more frequently we eat, the less, as a general rule, we enjoy. At present, it is customary to eat so often that we seldom, if ever, reach the point of having a good appetite; and what of appetite we have, at first, is soon spoiled. The less frequently we eat, on the contrary, even to the comparatively narrow limits of once a day, the more we enjoy.

641. But observe, if you please, I do not say God has united with our duty the highest possible degrees of immediate pleasure, but only the greatest amount in the end. There is room enough left for self-denial, or what is usually called by that name; by which I mean, a denial of present pleasure, at least in part, for the sake of pleasure in the distance, which is greater in the aggregate.

642. There are certain physiological considerations which aid us in determining how often we should eat; or, rather, in determining how often we should not eat. We have seen (551) that the process of chymification is forwarded, in no small degree, by a species of muscular motion which has a slight resemblance to the churning process among dairy-women.

643. This churning muscular motion generally continues till the stomach is cleared of its contents; i.e., till all, or nearly all, has passed out at its pyloric orifice. The time required for this varies, in the adult, from two or three to four or five hours. (558.) In children, the process, like those of breathing and circulation, is more rapid.

644. Now, it is a law with all voluntary or willing muscular parts of the body, that they shall have their seasons of rest. But the heart is muscular, and there are muscles in the walls of the thorax to aid in moving the lungs; and then, as we have seen, the stomach is muscular. None of these, it is true, are voluntary or willing muscles. Their motion takes place without our having much to do with it, directly.

645. Still, it is true, most undeniably true, that these parts need rest. The muscular parts of the heart and lungs have their intervals of rest, though they are short; and is not this the plainest proof that they need it? The muscular parts of the stomach, in all probability, come under the same necessity. Sometimes they obtain this rest; at others they do not. But I have spoken on this subject before. (120-122.)

646. When we breakfast at six, take a lunch at nine or ten, dine at twelve, take another lunch at three, and eat a heavy supper at six, the stomach probably has no rest during the day, and, in consequence, is so much fatigued at night, that the load which is imposed on it at six is not wholly cast off during the night, and we rise in the morning to go again the same round, and with similar results.

647. Then, again, when we rise at seven, breakfast at eight, take a lunch at eleven, or twelve, as in fashionable life, dine at two, take tea at five, and a heavy lunch of the most heavy of all indigestibles at nine or ten, we come to the hour of rest, as before, with a jaded stomach; and in due preparation for a restless and distempered night.

648. And the reward we have so richly earned is sure to be received. Our sleep is too sound on the one hand, or too much disturbed on the other. The latter result is most frequent. We toss out the night in distressing dreams, and wake the next morning to a bad taste in the mouth, a dryness of the throat, a dull headache and loss of appetite, and an unwillingness to rise, except from the most pressing necessity.

649. Such a course of life, persisted in for weeks, months, or years, will bring about, in most persons, a bad state of things in the alimentary canal, which, in its sympathies or effects, sometimes extends to other parts of the system. Many a tooth-ache, ear-ache, head-ache, and neuralgic attack, and not a few cold feet and sour stomachs, may be fairly charged to the errors of which I have here spoken.

650. Children, no doubt, should eat much more frequently than adults. True, their stomachs are not so strong, nor their digestive powers, though they are generally more active. But even our children eat too often, in most instances. They are trained to it from the very first. Some of them seem to be almost always eating, from morning to night. Little infants, in most instances, are even nursed or fed in the night. And the penalty is but too well known. Half of them, or nearly half, die under ten years of age; and this is one of the causes.

651. The healthy adult who eats but three times a day, and this at regular intervals of about six hours, gives his stomach a little time for rest; and may hope to proceed on in the journey of life, at least a short time, without disease. He may indulge this hope, I mean, if other things are as they should be.

652. But three meals a day for an adult, whatever may be his habits or circumstances, — except in the rare case of some peculiar disease, — is the maximum number which is admissible. It is running as much risk as we can with safety. It is going as near the edge of the precipice as we can and not fall from it, instead of taking the highest and safest and best road!

653. They who take but two meals a day, especially during the short days of winter, not only give their digestive powers — their stomachs in particular — more time for rest, but actually enjoy more, and find themselves in better general health. Of this habit we have many eminent living examples. In this case the first meal might be profitably taken at ten o’clock in the forenoon, and the second at four in the afternoon.

654. And there are men to be found who take but one meal a day, and yet remain quite healthy. The elder Fowler, the phrenologist, is one of them. Such, too, in past years, were Talleyrand of France, and Mr. Taliaferro of Virginia. It is even stated that some of the old Romans ate but one meal a day. Seneca, though worth an estate of $15,000,000, taught the doctrine, and, as it is said, practised it.

655. It is even told of Mr. Taliaferro, that he went still farther. When by any unavoidable circumstance he was unable to dine at his usual hour of the day, he deferred it to the next day. This was to eat only once in two days. But this course I think an error. Once a day is the minimum or smallest needful number of our meals.

656. On this point, however, I wish to be understood. I do not say, positively, that three meals a day are incompatible with the maintenance of tolerable health; nor that one a day is sufficient. But I do say that more than three are injurious ; that two would for most persons be preferable to three; and that one for most people may after all be found adequate to every purpose. Indeed, I am inclined to think it would be so.

657. They who take but one meal a day secure at least one important point, that of having always a good appetite. At least they gain this point provided they do not eat too much at this one meal. Most persons, as we have seen, eat so often that they never know what a good appetite is. They always eat before they are truly hungry, in a physiological sense; and hence know neither the blessing of a good appetite nor of true gustatory enjoyment.

658. They remind me of a half-idiot, whom I knew in early life, who was always pressing the question, “Don’t you wish to know the art of never being dry?” — that is, thirsty. “Always mind to drink before you are dry,” he added, “and you will never be dry.” We have most of us already made a faithful application of the fool’s rule to our eating. We eat always before we are hungry, and hence are never hungry.

[Questions. — Is there not a general belief abroad that we eat too often? Have we arrived, as yet, at a settled opinion on this subject?]


659. In the last section I was obliged to encroach a little on the topic assigned to this. I was obliged to allude to the evils of eating too often; and this of course involved the subject of eating between our meals, or, as it is called, of taking lunches or luncheons. But I have not yet said all that the case requires. Eating between our regular meals is a dietetic transgression of no ordinary magnitude.

660. Whether we eat once, twice, thrice, or ten times a day, we should stop with our regular meals. Nothing containing nutriment, whether in a solid or liquid condition, should go down our throats between our meals, except water. To this rule, so far as the healthy are concerned, I know of no exception.

661. May we not eat an apple, it will be asked, or a little fruit, of such kinds as we happen to meet with, or a few nuts? Must we go without all these things, which the kind hand of the great Creator has scattered all along our path — probably not in vain? Would we not be even ungrateful to him, did we do so?

662. No doubt that these things, for the most part, are made to be eaten, either by us or the other animals, or both. But they should be brought to our tables, and, without exception, made a regular part of our meals. Not indeed at the end, after we have eaten enough of something else; nor yet at the beginning, merely to excite an appetite for other food. They should be eaten, as the potato usually is, as a part of our meal.

[Have we not studied the subject in a wrong manner? What is a better way? What should be the true inquiry in prosecuting the study of hygiene? In our inquiries is pleasure to be overlooked, entirely so? Why not? Is our enjoyment in eating in proportion always to the number of our meals? Is he the greatest gainer in point of mere pleasure in eating, who gets the most pleasure immediately?

What are we to infer, in this particular, from the muscular character of the stomach? How may we eat so as to give the stomach and other digestive organs no rest? What are the frequent evidences of abuse during the previous day? What diseases may ensue? Should children eat oftener than adults? What is said, in particular, of the effects of eating three meals a day? What of eating two only? What of eating but one? Are there some eminent examples in both these latter kinds? To what extreme did Mr. Taliaferro go? Who are they that always have a good appetite? What anecdote is related of a certain idiot? What is the application?]

663. It may perhaps be said that our ancestors — puritanical though they were — accustomed themselves not only to lunches in the forenoon and afternoon, but to nuts and cider or apples and cider in the evening, and yet were a healthier people, by far, than their more squeamish descendants; and there will be no want of truth as the basis of the remark.

664. But, remember, that if they were more healthy than we, then we, of course, are less healthy than they. How came we thus? Is it a matter of chance, or hap-hazard? Do these things spring out of the ground? Is there not a cause for every effect? Do we not inherit a deteriorated and deteriorating constitution?

665. Besides, our fathers and grandfathers set out with better constitutions than we, so that, whatever may have been the cause of their better or our inferior stamina, they could most certainly bear up longer under violations of physical law than we, their descendants. It does not then follow, as a necessary inference, that we may eat lunches because they did.

666. May we not take nourishing drinks between our regular meals, such as milk and water, molasses and water, and bread coffee? some will ask. Not a drop. Better, by far, to eat a piece of dry bread; for that will be masticated. But you do not want either. The sediment of nutritious drinks (561) is one of the hardest ordinary things the stomach has to contend with. It is, moreover, a curious fact that a piece of dry bread, well chewed, will often quench thirst better than any liquid, even water. But, I repeat, I do not recommend even that.

667. Anything that contains nutriment must, of course, set the stomach and other digestive organs at work, more or less; even if it is nothing but a strawberry, or a lump of gum or sugar, or some aromatic seeds. I do not say or believe that it takes as long, or tasks the digestive machinery as severely, to work up a lump of sugar or a strawberry into chyle, as a full meal; but I do say that the whole process of digestion, complicated as it is, must be gone through with.

668. Many, who have listened patiently to remarks like these, have at length exclaimed, with some surprise: “But what is the laboring man to do, especially in the long hot days of haying and harvesting, without something to sustain him between his meals? You proscribe stimulating drink, and very properly; but what will you propose as a substitute? He would faint away without something. Or, if he should not faint, there would often be a gnawing at the stomach, which would be insupportable.”

669. It should be distinctly known to everybody, that neither the faintness nor the gnawing here spoken of, indicate any real hunger. They are mere nervous sensations. They indicate, moreover, a diseased condition of the nerves. If any one doubts, let him but make the following experiment. The writer has made it for himself, and that repeatedly.

670. While your fellow-laborers are removing, for the time, their gnawing and faintness by a lunch, just seat yourself at their side, and, instead of adding a new load to the already overloaded and sympathizing stomach, drink slowly a small quantity of pure water, tell a story or hear one, and, if you can, excite a little the risible faculties; and when they return to their labor, join them, as before. Pursue this course a few days, or a few weeks, and see who endures it best, and complains most of gnawing and faintness.

671. It is no uncommon thing to hear farmers telling how glad they are to be through with their haying and harvesting. But it is they who use lunches, or take other means beyond their regular meals for restoring themselves temporarily at the expense of the future, who complain most. He who eats of plain food twice or three times a day, and drinks nothing but water, endures best the heat and fatigue, and suffers least from gnawing and faintness.

672. Young men in groceries, eating-houses, and inns, as well as clerks in public offices, and in shops and factories, often injure their health very much by a foolish acquired habit of tasting various things which are constantly before them, such as fruits, nuts, confectionery, sugar, dried fish, cordials, etc. Clerks, in addition to all this, sometimes eat wafers.

673. It is but a few days since I saw a young man about thirty years of age, of giant constitution by inheritance, who was suffering severely in his digestive machinery from the very cause, by his own voluntary confession, of which I am now speaking. And I have before my mind’s eye the painful history of a young man whom I twice cured of dyspepsia from this same cause, but who afterwards went beyond my reach, and fell a victim to it.

674. Perhaps the worst violation of the law which forbids eating between meals, is found in the wretched habit of the young, of eating what are called oyster suppers, at late hours and at improper places. Our cities, and sometimes our large towns, abound with places of resort for those who will not deny their appetites; and it is not surprising that they so often prove, not only a pathway to the grave, but as Solomon says, to hell.


675. There are to be found, among us, a few strong men and women — the remnant of a by-gone generation, much healthier than our own — who can eat at random, as the savages do, and yet last on, as here and there a savage does, to very advanced years. But these random-shot eaters are, at most, but exceptions to the general rule, which requires regularity.

676. For very few things, I am quite sure, can be more obvious to the most careless observer, than that those individuals who are most regular in regard to eating, other things and circumstances being equal, are the most healthy. And, what is of very great importance, too, any one who will take the trouble may soon satisfy himself that it is these regular men and women whose children inherit the best constitutions.

677. I have, indeed, admitted that we are so far the creatures of habit that we can accustom ourselves to almost any hours for eating, and to one, two, three, or more meals a day, as well as to many other things which are generally regarded as objectionable; and yet not suffer much, immediately. But I have also shown and insisted that this does not prove we are wise in forming these habits. We must look a little way into the future, and have regard to the good of the race, as well as to our own present gratification or happiness.

678. It is often said that since the conditions of civic life require occasional irregularities, it is desirable to accustom ourselves to such irregularities, betimes. For, if we do not, it is still insisted, we shall be liable, at times, to such derangement and disturbance in our systems, from unavoidable changes, as might subject us to a long and perhaps severe fit of sickness.

[Questions. — Is eating between our meals a light transgression? Should nothing which contains nutriment be swallowed between meals? May we not eat fruits? Why not, if the fruits are made to be eaten? Our ancestors ate lunches; why may not we? What is said of milk and water, molasses and water, etc., between meals? Must the whole work of digestion be gone through with, when we eat but a single nut, or a strawberry? May not the hard laborer have lunches? What then shall we do, when gnawing and faintness arise? Have these sensations nothing to do with real hunger? What experiment is proposed? To what dangers are young men sometimes exposed in groceries, shops, eating-houses, public offices, etc.? Are they apt to yield to the temptations? What case is related by the author? What still more striking case came under his observation? What is the worst violation of the rule for infrequent eating?]

679. This reasoning, by way of objection to the doctrine of regularity in our habits, is certainly specious. The great difficulty with it is, that it is practically untrue. For few things can be more easily shown than that they whose digestive systems hold out best, are precisely those who are most regular in their habits of eating, drinking, etc.

680. It is indeed true that such persons, when subjected to the supposed necessary irregularities of civic life, above alluded to, may be subjected, at times, to a little temporary disturbance, but it quickly passes away. Does not this prove the general integrity of the digestive function? No condition of the human stomach is more to be dreaded than that unresisting state which permits us to make it a complete scavenger for the time; while the abuse awakens slowly, in some remoter part of the human confederacy, a terrible insurrection, and still more terrible retribution.

681. I knew a physician who, at home and abroad, with others, and especially with himself, passed for a wise man. Yet, unable to resist the temptations incident to the life of a country medical practitioner, he gradually fell into the utmost irregularities about his meals. For his morning meal he had no appetite; at the dinner hour he was among his patients, eating at any hour convenient; or, oftener still, refusing to eat at all.

682. On returning to his family, — often late at evening, — his faithful wife, who knew his habits and expectations, was accustomed to prepare for him as rich and as abundant a meal as possible, of which he almost always partook in excess. But the penalty of his transgression was fearful. Disease, painful and harassing, early followed; and, though blessed with an “iron constitution” by birthright, he sunk into the grave at sixty-five.

683. The history of this man is, in substance, that of thousands. I have myself witnessed twenty years of the most intense anguish, ended by a premature and terrible death, which was the obvious result of physical disobedience. The penalty, it has repeatedly been said, does not always fall directly on the suffering organ or function, but sometimes on a part in sympathy with it.

684. It may, to many, seem strange, but it is nevertheless a fact, that they who are most regular with regard to their habits of eating, — whether as it regards times of eating, quality of the food, or quantity, — are the very persons who suffer least, as a permanent thing, when compelled to occasional changes or interruptions of their accustomed habits. Or, if they suffer, the suffering is but temporary. Their stomachs are stomachs of integrity, and their promptitude in meting out justice, and putting to rights injurious tendencies, is as striking as their integrity.

685. Locke, the philosopher, has somewhere told us that when a child asks for food at any other time than at his regular meals, plain bread should be given him — no pastry, no delicacies, but simply plain bread. If the child is really hungry, he says, plain bread will go down; if not, let him go without till he is so.

686. But why give him anything at all between his regular meals? These, to be sure, should be somewhat more frequent than our own; but this is not to make concessions to irregularity. Is it not truly marvellous to find the best of men — those who in many things have thought for themselves — still yielding to authority when arrayed against the plainest good sense?

687. It is very unfortunate for human health and happiness that the young should be trained from the very first — and to a most lamentable extent — in the way in which they should not go. They are very tenacious of life, — are made to live, — and yet, presuming on their known tenacity of life, we only make them the greater sufferers on account of it. I have known many a child, swept away by summer and autumnal diseases, who, but for his past irregularities in eating, might very probably have escaped.

688. That to train up a child in the way he should go, in every particular, is exceedingly difficult, every parent, master, or guardian well knows. Forbidden trees, on which hang curses, beset everywhere the path of human life, especially that broader division of it which, alas! so many of us travel. How to have our children escape all pitfalls and dangers, — how, even, to escape them ourselves, — is a question not by any means easy of solution; but its importance is at the least equal to its difficulties.

689. I wish the young could fully understand that every time they depart from their accustomed usages, and, during the intervals of their meals (be the latter few or many), venture on a little fruit, a little candy, a little confectionery, etc., they are not only impairing their appetite, and contaminating their blood, but impairing the tone of their digestive system, and deranging the action, more or less, of the whole alimentary canal.

690. Every well-directed effort to invigorate the alimentary canal, and increase the tone of that and the greater internal surface of the lungs, is richly repaid in future hardihood and health; while every neglect, or disregard — everything disloyal to the calls and demands of Nature’s conservator — is repaid in near or remote suffering, and perhaps transmitted to yet unborn generations.


691. Nothing is more common than the remark that the greatest dietetic error is with regard to quantity. It is admitted that we often err, as regards quality; that we eat irregularly; and that we eat too fast. And yet the great practical error, after all, we are told, is, that we eat too much.

692. There is truth in the remark, as the subject must necessarily be viewed by those whose standard of hygiene is still low. And yet, bad as excessive alimentation may be, it is but the natural — I had almost said necessary — result of certain errors lying back of it. If the quality of our food, and the modes of preparing and receiving it, and the moral tendencies of our nature, were such, from the very first, as they ought to be, there would be comparatively little among us of excess.

693. The common doctrine of intelligent men is, that we eat about twice as much as nature’s best purposes require. Philosophers, physiologists, chemists, pathologists, dietitians, and even many of the unenlightened, all agree in this. Not of course that every individual eats twice as much as he ought; but that, as a people, here in the United States, this is true.

694. Most persons, it would seem, eat just about as much as they can and not suffer from it immediately. The inquiry with most who inquire at all, is not how little is best for them and how much they can save, beyond this measure, for “him who needeth”; but how much they can consume, without loss of health or character as the consequence.

[Questions. — What is said of certain random-eaters among us? Are they whose habits of eating are most regular, usually the most healthy? Must we have regard, in the formation of our habits, to the good of our race? What very specious objection is sometimes made to these views and doctrines? Why is it unsound? Relate the anecdote of a medical man, and tell me what it is designed to prove. Is this man’s history substantially that of thousands? What has the philosopher Locke said? Wherein is he mistaken? What is there especially unfortunate in an early training? Do all our dietetic errors, especially our irregularities in regard to eating, tend to derange the action and motion of the alimentary canal? What important hints does this afford in the education of the young? What equally important hints does it afford to the self-educated?]

695. In truth, the declaration of eighteen hundred years ago, that all seek their own, not another’s (or others’) good, covers the whole ground. To get good and apply it to the gratification of our own propensities, whatever may become of others, is fallen nature’s great law. As John Foster has well said, this not caring for others is the very essence of human depravity.

696. It is frequently asked how much we should eat; and some are unsatisfied till we put in requisition the scales, and tell them exactly how many pounds or ounces they must take, daily. I have even dined, in the city of Boston, with a man otherwise respectable, who had his scales on the table, and proceeded to weigh out, before me, his dinner.

697. Of course I do not intend to question the propriety or the usefulness of weighing out our food, at least, occasionally. Experiments of weighing food, made by scientific or thinking men, for scientific or practical purposes, might be made — no doubt sometimes are made — quite useful.

698. Thus, in experiments made in Glasgow, in Scotland, on laborers, who, from their increased expenditure during their exercises, are very naturally supposed to require as large a supply of food as any other class of men, it has been found that two pounds of good bread, daily, or six pounds of good potatoes, (which in point of nutriment are deemed about equal to two pounds of bread,) is the largest quantity demanded or required.

699. President Hitchcock, late of Amherst College, and Mr. Graham, have taught that the average quantity of nutriment which the best development and support of the body require, is somewhat less than this. They, too, have made their conclusions from observation and experiment. The former would reduce the British standard quantity about one-fourth; the latter, nearly one-half.

700. Much allowance, in this matter, must be made for early training, as will be seen in the next section. I once had the pleasure of sustaining, at college, a most deserving young man, who could not get along, as he believed, without two pounds of bread, or its equivalent, daily. But he had been trained to excess; and for the time seemed to demand it. However, he exhausted his physical capital in a few years, and died bankrupt!

701. Are there, then, you may be disposed to ask, no specific rules for the individual, about quantity? Must we gather up, from abstract or general principles and from facts, a code for ourselves? Like the new-fledged arithmetician at school, must we make our own rules? Is experience in dietetics every thing, and science nothing?

702. Not quite so fast. I have given you the deductions of science already. It has determined, no less surely than experience, that we eat too much. It has told us what is the maximum quantity required. What the minimum or smallest quantity we really need is, we have not yet inquired. And most persons do not choose to make the inquiry, lest they should have to resist, a little, their propensities.

703. To those who have moral courage enough — in other and better words, enough of Christian philosophy — to dare to make the inquiry, a few rules may be given which will enable them to approximate towards the truth in the case, by seeking an answer to the inquiry: How little can we get along with, and at the same time best discharge all our duties and secure all lawful and proper interests?

704. We have been taught, in time past, to leave off hungry; or, as some express it, with a good appetite. Or, as others still, are wont to say, we have been told never to eat quite enough. The rule is a good one, as far as it goes. I have known a few who partly observed it; and they believe they owe to this partial obedience their health and life.

705. Thus, Grant Thorburn, whose writings, over the signature of Laurie Todd, have interested and delighted many, and who, at the age of ninety, or nearly so, is almost as young in his feelings as ever he was, is accustomed to say to his friends that he never ate enough in his whole life.

706. Early in the year 1852, I called to see a man in Ohio, who was eighty-seven years of age. It was one of the severest days of a most severe winter. He was in the woods, at work, for he was a farmer; but he soon came home. Surprised at his power to labor and endure the cold, I inquired about his habits; and, among other things, asked him about the quantity of his food. His answer included just such a statement as that of Mr. Thorburn.

707. Cases of this kind might be multiplied, not, however, to an indefinite extent; for, most unhappily, the world as yet does not abound with them. I will only add to the list, at present, John Williams, a Baptist minister of Rhode Island, who died at the age of one hundred years or more, and myself.

708. It is quite possible to err, however, under this rule. A person who bolts his food will eat much more without reaching the point of satiety than one who does not. While, therefore, he who bolts food has not reached the stopping-place, so far as he knows, another who masticates well has reached it with far less food. The former may therefore eat too much and yet leave off hungry.

709. It is a better rule still, to eat no longer than the food appears to refresh us, bodily and mentally. This rule, I grant, is liable to the same difficulties with the preceding; nevertheless, it restricts us more. For even Grant Thorburn, who never eats enough, may possibly sometimes eat so long as to become dull in body or mind as the result. I am not without doubt whether he and my Ohio friend always leave off their meal with feelings of merriment, and with a disposition to dance and sing, like children. Yet such, as I believe, should be the effect of our eating. Its main object, I grant, is to secure nourishment for a future hour; but it has a secondary object, too, which is refreshment and gratification.

710. It is recorded of President Jefferson, that he was accustomed to remark that no man, when he comes to die, ever repents of having eaten so little. This remark would be worth more if it were true that men are apt to repent of eating too much. But the truth is, we seldom exercise any genuine repentance at all when we come to die, unless we have begun the work before. Death-beds are not the very honest places some have supposed. Men generally die as they live.

711. The early travellers among the Japanese tell us that a native of that country, especially of the interior, will work all day long on a mere handful of rice and a little fruit. Yet the Japanese are among the stoutest and strongest men of Asia; and for size and strength almost resemble the German, the Swiss, and the Yankee. Can it be that they suffer for want of food?

712. We come back, then, from our reasonings and facts to the point whence we started, viz., to the affirmation that we generally eat twice as much as we ought, and that retrenchment is loudly and imperiously demanded. Few err on the other side. Inclination, habit, refined cookery, and the customs of society are all against it.

713. I have admitted that the laborer, as a general rule, requires more food than other men, because his expenditure is greater. Yet it does not thence follow, that he who performs two days’ work in one, and who consequently overworks himself, should eat in the same proportion, that is, twice as much. Generally speaking, if he really overworks, he should eat somewhat less, since the same causes which have overtasked and crippled his general system must have reduced the energies of his digestive system in the same proportion.

Ancient Greek View on Olive Oil as Part of the Healthy Mediterranean Diet

“I may instance olive oil, which is mischievous to all plants, and generally most injurious to the hair of every animal with the exception of man, but beneficial to human hair and to the human body generally; and even in this application (so various and changeable is the nature of the benefit), that which is the greatest good to the outward parts of a man, is a very great evil to his inward parts: and for this reason physicians always forbid their patients the use of oil in their food, except in very small quantities, just enough to extinguish the disagreeable sensation of smell in meats and sauces.
~Plato, Protagoras (Socrates’ dialogue with Protagoras)

So what did ancient Greeks most often use for cooking? They preferred animal fat, most likely lard. Pigs have a much higher amount of fat than most other animals. And pigs are easy to raise under almost any conditions: cold and hot, fields and forests, plains and mountains, mainlands and islands. Because of this, lard is one of the few common features in traditional societies, including the longest-lived populations.

That was true of the ancient Greeks, but it has been true ever since in many parts of the world, especially in Europe but also in Asia. This continued to be true as the Western world expanded with colonialism and new societies formed. As Nina Teicholz notes, “saturated fats of every kind were consumed in great quantities. Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard” (The Big Fat Surprise).

To return to the Greeks, the modern population is not following a traditional diet. Prior to the World War era, pork and lard were abundant in the diet. But during wartime, the pig population was decimated by violence, disruption of the food system, and the confiscation of pigs to feed the military.

The same thing happened in the most pig-obsessed culture in history, that of the long-lived Okinawans, when the Japanese during WWII stole or killed all of their pigs. The Japanese perceived these shamanistic rural people on an isolated island to be a separate race and so treated them as less worthy. The Okinawans’ independence depended on their raising of pigs, and that was taken away from them.

When the Greeks and Okinawans were studied after the war, the diet observed was not the diet that had existed earlier: the traditional lard-based and nutrient-dense diet that most of the population had spent most of their lives eating. They were long-lived not because of the lack of lard but because it had once been abundant.

So something like olive oil, once used primarily as lamp fuel, was turned to as a replacement for the lost access to lard. Olive oil was a poverty food used out of necessity, not out of preference. It is a great credit to modern marketing and propaganda that olive oil has been sold as a healthy oil when, in fact, most olive oil bought in the store is rancid. Olive oil is actually a fruit juice, which is why it can’t be kept long before going bad; maybe that is also why it gained a bad reputation in ancient Greece.

Lard and other animal fats, on the other hand, because they are heavily saturated, have a long shelf life and don’t oxidize when used for cooking. Also, unlike vegetable oils, fats from pastured animals are filled with fat-soluble vitamins and omega-3 fatty acids, essential to health and longevity. How did this traditional knowledge, going back to the ancient world, get lost in a single generation of devastating war?

* * *

American Heart Association’s “Fat and Cholesterol Counter” (1991)

Even the harm done by hydrogenated fat gets blamed on saturated fat, since the hydrogenation process turns some small portion of it saturated; this ignores the heavy damage and inflammatory response caused by oxidation, both in the industrial processing and in cooking. Not to mention that those hydrogenated fats, as industrial seed oils, are filled with omega-6 fatty acids, the main reason they are so inflammatory. Saturated fat, on the other hand, is not inflammatory at all. This obsession with saturated fat is so strange. It never made any sense from a scientific perspective. When the obesity epidemic began, along with all that went with it, American consumption of saturated fat had been steadily dropping for decades, ever since the invention of industrial seed oils in the late 1800s and the fear about meat caused by Upton Sinclair’s muckraking novel about the meatpacking industry, The Jungle.

The amount of saturated fat and red meat in the diet has declined over the past century, replaced with those industrial seed oils and lean white meat, along with fruits and vegetables, all of which have been increasing. Chicken, in particular, replaced beef, and what stands out about chicken is that, like those industrial seed oils, it is high in the inflammatory omega-6 fatty acids. How could saturated fat be causing the greater rates of heart disease and such when people were eating less of it? This scapegoating wasn’t only unscientific but blatantly irrational. All of this info was known way back when Ancel Keys went on his anti-fat crusade (The Creed of Ancel Keys). It wasn’t a secret. And it required cherry-picked data and convoluted rationalizations to explain away.

Worse than removing saturated fat when it’s not a health risk is the fact that it is actually an essential nutrient for health: “How much total saturated fat do we need? During the 1970s, researchers from Canada found that animals fed rapeseed oil and canola oil developed heart lesions. This problem was corrected when they added saturated fat to the animals’ diets. On the basis of this and other research, they ultimately determined that the diet should contain at least 25 percent of fat as saturated fat. Among the food fats that they tested, the one found to have the best proportion of saturated fat was lard, the very fat we are told to avoid under all circumstances!” (Millie Barnes, The Importance of Saturated Fats for Biological Functions).

It is specifically lard that has been most removed from the diet, and this is significant as lard was central to the American diet until this past century: “Pre-1936 shortening is comprised mainly of lard while afterward, partially hydrogenated oils came to be the major ingredient” (Nina Teicholz, The Big Fat Surprise, p. 95); “Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard” (p. 126). And what about the Mediterranean people who supposedly are so healthy because of their love of olive oil? “Indeed, in historical accounts going back to antiquity, the fat more commonly used in cooking in the Mediterranean, among peasants and the elite alike, was lard.” (p. 217).

Jason Prall notes that long-lived populations ate “lots of meat” and, specifically, “They all ate pig. I think pork was the only common animal that we saw in the places that we went” (Longevity Diet & Lifestyle Caught On Camera w/ Jason Prall). The famously long-lived Okinawans also partake of everything from pigs, such that their entire culture and religion centered around pigs (Blue Zones Dietary Myth). Lard, in case you didn’t know, comes from pigs. Pork and lard are found in so many diets for the simple reason that pigs can live in diverse environments, from mountainous forests to tangled swamps to open fields, and they are a food source available year round.

Blue Zones Dietary Myth

And one of the animal foods so often overlooked is lard: “In the West, the famous Roseto Pennsylvanians also were great consumers of red meat and saturated fat. Like traditional Mediterraneans, they ate more lard than olive oil (olive oil was too expensive for everyday cooking and too much in demand for other uses: fuel, salves, etc). Among long-lived societies, one of the few commonalities was lard, as pigs are adaptable creatures that can be raised almost anywhere” (Eat Beef and Bacon!). […]

Looking back at their traditional diet, Okinawans have not consumed many grains, added sugars, industrial vegetable oils, or highly processed foods and they still eat less rice than other Japanese: “Before 1949 the Okinawans ate NO Wheat and little rice” (Julianne Taylor, The Okinawan secret to health and longevity – no wheat?). Also, similar to the Mediterranean people (another population studied after the devastation of WWII) who didn’t use much olive oil until recently, Okinawans traditionally cooked everything in lard that would have come from nutrient-dense pigs, the fat being filled with omega-3s and fat-soluble vitamins. Also, consider that most of the fat in lard is monounsaturated, the same kind of fat that is deemed healthy in olive oil.

“According to gerontologist Kazuhiko Taira, the most common cooking fat used traditionally in Okinawa is a monounsaturated fat-lard. Although often called a “saturated fat,” lard is 50 percent monounsaturated fat (including small amounts of health-producing antimicrobial palmitoleic acid), 40 percent saturated fat and 10 percent polyunsaturated. Taira also reports that healthy and vigorous Okinawans eat 100 grams each of pork and fish each day [7]” (Wikipedia, Longevity in Okinawa).

It’s not only the fat, though. As with most traditional populations, Okinawans ate all parts of the animal, including the nutritious organ meat (and the skin, ears, eyes, brains, etc). By the way, besides pork, they also ate goat meat. There would have been a health benefit from their eating some of their meat raw (e.g., goat) or fermented (e.g., fish), as some nutrients are destroyed in cooking. The small amount of soy that Okinawans ate in the past was mostly tofu fermented for several months, and fermentation is one of those healthy preparation techniques widely used in traditional societies. They do eat some unfermented tofu as well, but I’d point out that it typically is fried in lard, or at least it used to be. […]

The most popular form of pork in the early 1900s was tonkatsu, which, by the way, was originally fried in animal fat according to an 1895 cookbook (butter in that recipe, but probably lard before that early period of Westernization). “Several dedicated tonkatsu restaurants cropped up around the 1920s to ’40s, with even more opening in the ’50s and ’60s, after World War II — the big boom period for tonkatsu. […] During the Great Depression of the 1930s, a piece of tonkatsu, which could be bought freshly cooked from the butcher, became the ultimate affordable payday treat for the poor working class. The position of tonkatsu as everyman food was firmly established.” This pork-heavy diet was what most Japanese were eating prior to World War II, but it wouldn’t survive the conflict, when food deprivation came to afflict the population long afterwards.

Comment by gp

I just finished reading The Blue Zones and enjoyed it very much, but I was wondering about something that was not addressed in great detail. All of the diets discussed other than the Adventists (Sardinia, Okinawa and Nicoya) include lard, which I understand is actually used in significant quantities in some or all of those places. You describe (Nicoyan) Don Faustino getting multiple 2-liter bottles filled with lard at the market. Does he do this every week, and if so, what is he using all of that lard for? In Nicoya and Sardinia, eggs and dairy appear to play a large role in the daily diet. Your quote from Philip Wagner indicates that the Nicoyans were eating eggs three times a day (sometimes fried in lard), in addition to some kind of milk curd.

The Blue Zones Solutions by Dan Buettner
by Julia Ross (another version on the author’s website)

As in The Blue Zones, his earlier paean to the world’s traditional diets and lifestyles, author Buettner’s new book begins with detailed descriptions of centenarians preparing their indigenous cuisines. He finishes off these introductory tales with a description of a regional Costa Rican diet filled with eggs, cheese, meat and lard, which he dubs “the best longevity diet in the world.”

Then Buettner turns to how we’re to adapt this, and his other model eating practices, into our current lives. At this point he suddenly presents us with a twenty-first century pesco-vegan regimen that is the opposite of the traditional food intake that he has just described in loving detail. He wants us to fast every twenty-four hours by eating only during an eight-hour period each day. He wants us to eat almost no meat, poultry, eggs or dairy products at any time. Aside from small amounts of olive oil, added fats are not even mentioned, except to be warned against.

How Much Soy Do Okinawans Eat?
by Kaayla Daniel

There are other credibility problems with the Okinawa Centenarian Study, at least as interpreted in the author’s popular books. In 2001, Dr. Suzuki reported in the Asia Pacific Journal of Clinical Nutrition that “monounsaturates” were the principal fatty acids in the Okinawan diet. In the popular books, this was translated into a recommendation for canola oil, a genetically modified version of rapeseed oil developed in Canada that could not possibly have become a staple of anyone’s diet before the 1980s. According to gerontologist Kazuhiko Taira, the most common cooking fat used traditionally in Okinawa is a very different monounsaturated fat-lard. Although often called a “saturated fat,” lard is 50 percent monounsaturated fat (including small amounts of health-producing antimicrobial palmitoleic acid), 40 percent saturated fat and 10 percent polyunsaturated. Taira also reports that healthy and vigorous Okinawans eat 100 grams each of pork and fish each day. Thus, the diet of the long-lived Okinawans is actually very different from the kind of soy-rich vegan diet that Robbins recommends.

Nourishing Diets:
How Paleo, Ancestral and Traditional Peoples Really Ate

by Sally Fallon Morell
pp. 263-270
(a version of the following can be found here)

From another source, 7 we learn that:

Traditional foods of Okinawa are extremely varied, remarkably nutrient-dense as are all traditional foods and strictly moderated with the philosophy of hara hachi bu [eat until you are 80 percent full]. While the diet of Okinawa is, indeed, plant-based it is most certainly not “low fat” as has been posited by some writer-researchers about the native foods of Okinawa. Indeed, all those stir fries of bitter melon and fresh vegetables found in Okinawan bowls are fried in lard and seasoned with sesame oil. I remember fondly that a slab of salt pork graced every bowl of udon I slurped up while living on the island. Pig fat is not, as you can imagine, a low-fat food yet the Okinawans are fond of it. Much of the fat consumed is pastured as pigs are commonly raised at home in the gardens of Okinawan homes. Pork and lard, like avocado and olive oil, are a remarkably good source of monounsaturated fatty acid and, if that pig roots around on sunny days, it is also a remarkably good source of vitamin D.

The diet of Okinawa also includes considerably more animal products and meat—usually in the form of pork—than that of the mainland Japanese or even the Chinese. Goat and chicken play a lesser, but still important, role in Okinawan cuisine. Okinawans average about 100 grams or one modest portion of meat per person per day. Animal foods are important on Okinawa and, like all food, play a role in the population’s general health, well-being and longevity. Fish plays an important role in the cooking of Okinawa as well. Seafoods eaten are various and numerous—with Okinawans averaging about 200 grams of fish per day.

Buettner implies that the Okinawans do not eat much fish, but in fact, they eat quite a lot, just not as much as Japanese mainlanders.

The Okinawan diet became a subject of interest after the publication of a 1996 article in Health Magazine about the work of gerontologist Kazuhiko Taira, 8 who described the Okinawan diet as “very healthy—and very, very greasy.” The whole pig is eaten, he noted, everything from “tails to nails.” Local menus offer boiled pig’s feet, entrail soup and shredded ears. Pork is marinated in a mixture of soy sauce, ginger, kelp and small amounts of sugar, then sliced or chopped for stir-fry dishes. Okinawans eat about 100 grams of meat per day—compared to 70 grams in Japan and just over 20 grams in China—and at least an equal amount of fish, for a total of about 200 grams per day, compared to 280 grams per person per day of meat and fish in America. Lard—not vegetable oil—is used in cooking. […]

What’s clear is that the real Okinawan longevity diet is an embarrassment to modern diet gurus. The diet was and is greasy and good, with the largest proportion of calories coming from pork and pork fat, and many additional calories from fish; those who reach old age eat more animal protein and fat than those who don’t. Maybe that’s what gives the Okinawans the attitudes that Buettner so admires, “an affable smugness” that makes it easy to “enjoy today’s simple pleasures.”

Hara Hachi Bu: Lessons from Okinawa
by Jenny McGruther

Traditional foods of Okinawa are extremely varied, remarkably nutrient-dense as are all traditional foods and strictly moderated with the philosophy of hara hachi bu. While the diet of Okinawa is, indeed, plant-based it is most certainly not “low fat” as has been posited by some writer-researchers about the native foods of Okinawa. Indeed, all those stir-fries of bitter melon and fresh vegetables found in Okinawan bowls are fried in lard and seasoned with sesame oil. I remember fondly that a slab of salt pork graced every bowl of udon I slurped up while living on the island. Pig fat is not, as you can imagine, a low-fat food yet the Okinawans are fond of it. Much of the fat consumed is pastured as pigs are commonly raised at home in the gardens of Okinawan homes. Pork and lard, like avocado and olive oil, are a remarkably good source of monounsaturated fatty acid and, if that pig roots around on sunny days, it is also a remarkably good source of vitamin D.

“What would Mister Rogers do?”

“He had faith in us, and even if his faith turns out to have been misplaced, even if we have abandoned him, he somehow endures, standing between us and our electrified antipathies and recriminations like the Tank Man of Tiananmen Square in a red sweater.”
~Tom Junod, My Friend Mister Rogers

A Beautiful Day in the Neighborhood is an inspiring and, in the end, challenging portrayal of Fred Rogers, AKA ‘Mister Rogers’. It took some suspension of disbelief, though. Tom Hanks does as good a job as is possible, but no one can replace the real thing. Mr. Rogers was distinctive in appearance and behavior. The production team could have used expensive CGI to make Hanks look more like the real man, but that was not necessary. It wasn’t a face that made the children’s TV show host so well respected and widely influential. A few minutes in, I was able to forget I was watching an actor playing a role and became immersed in the personality and moral character cast upon the screen of imagination, the movie presented as if a new episode of Mister Rogers’ Neighborhood had just been released.

The way the movie was done was highly effective. It was based on an Esquire article, Can You Say … Hero? by Tom Junod. It was jarring at first in taking a roundabout approach, but it might have been the only way to go about it for the intended purpose. Fred Rogers appears to have been a person who was genuinely and fully focused on other people, not on himself. So a biopic that captures his essence requires demonstrating this concern for others, which makes him a secondary character in the very movie that is supposedly about him. We explore his world by experiencing the profound impact he had on specific people, in this case not only Junod but also his family, while there are other scenes showing the personable moments of Mr. Rogers meeting with children. The story arc is about Junod’s change of heart, whereas Mr. Rogers remains who he was from the start.

This leaves Mr. Rogers himself as an unknown to viewers not already familiar with the biographical details. We are shown little about his personal life and nothing about his past, but the narrow focus helps to get at something essential. We were already given a good documentary about him last year. This movie was serving a different purpose. It offers a window to peer through, to see how he related and what it meant for those who experienced it. Part of the hidden background was his Christianity, as he was an ordained Presbyterian minister. Yet even as Christianity inspired him, he never put his faith out in the public view. As Jesus taught to pray in secret, Fred Rogers took it one step further by keeping his faith almost entirely hidden. He didn’t want to force his beliefs onto others. The purpose of religion is not dogma or outward forms. If religion matters at all, it’s about how it transforms people. That is what Mr. Rogers, as a man and a media personality, was all about.

Some people don’t understand this and so don’t grasp what made him so special. Armond White at National Review wrote that, “Heller and screenwriters Micah Fitzerman-Blue and Noah Harpster don’t show enough faith in Rogers’ remedies—and not enough interest in their religious origins. In short, the movie seems wary of faith (it briefly mentions that Rogers was an ordained minister) and settles for secular sentimentality to account for his sensibility and behavior. This not only weakens the film, but it also hobbles Hanks’s characterization” (Christian Faith Is the Missing Ingredient in A Beautiful Day in the Neighborhood). That misses the entire message being conveyed, not only the message of the movie but, more importantly, the message of Mr. Rogers himself. As Greg Forster subtly puts it, “that is of course the whole goddamned point here” (Pass the Popcorn: Anything Mentionable Is Manageable).

To have put Mr. Rogers’ Christianity front and center would have been to do what Mr. Rogers himself intentionally avoided. He met people where they were, rather than trying to force or coerce others into his belief system, not that he would have thought of his moral concern as a belief system. He was not an evangelical missionary seeking to preach and proselytize, much less attempting to save the lost souls of heathenish children or make Christian America great again. In his way of being present to others, he was being more Christ-like than most Christians, as Jesus never went around trying to convert people. Jesus wasn’t a ‘good Christian’ and, by being vulnerable in his humanity, neither was Fred Rogers. Rather, his sole purpose was just to be kind to others. Religion, in its highest form, is about how one relates to others and to the world. Thomas Paine voiced his own radical faith with the words, “The World is my country, all mankind are my brethren, and to do good is my religion.” I suspect Mr. Rogers would have agreed. It really is that simple, or it should be.

That childlike directness of his message, the simplicity of being fully present and relating well, that was the magical quality of the show, Mister Rogers’ Neighborhood. I didn’t appreciate it when I was a kid. It was a fixture of my childhood, a show I watched and that was all. But looking back on it, I can sense what made it unique. Like the man himself, the show was extremely simple, one might call it basic, demonstrated by the same ragged puppets he used his entire career. This was no fancy Jim Henson muppet production. What made it real and compelling to a child was what the people involved put into it, not only Fred Rogers but so many others who were dedicated to the show. Along with the simplicity, there was a heartfelt sincerity to it all. The scenes with the puppets, Daniel Striped Tiger most of all, were often more emotionally raw and real than what is typically done by professional actors in Hollywood movies.

That is what stands out about Tom Hanks’ performance in bringing this to life. He is one of the few actors who could come close to pulling it off and even his attempt was imperfect. But I have to give Hanks credit for getting the essence right. The emotional truth came through. Sincerity is no small thing, in this age of superficiality and cynicism. To call it a breath of fresh air is a criminal understatement. Mr. Rogers was entirely committed to being human and acknowledging the humanity in others. That is such a rare thing. I’m not sure how many people understood that about him, what exactly made him so fascinating to children and what created a cult-like following among the generations who grew up watching his show. As a character says about the drug D in A Scanner Darkly, “You’re either on it or you’ve never tried it.”

Some people claim that “sincerity is bullshit” (Harry Frankfurt), a sentiment I understand in feeling jaded about the world. But I must admit that Fred Rogers’ sincerity most definitely and deeply resonates for me, based on my own experience in the New Thought worldview I was raised in, a touchy-feely form of Christianity where emotional authenticity trumps outward form, basically Protestantism pushed to its most extreme endpoint. Seeing the emotional rawness in Mr. Rogers’ life, although coming from a different religious background than my own, reminded me of the sincerity that I’ve struggled with in myself. I’ve always been an overly sincere person and often overly serious, that is how I think of myself… but can anyone really ever be too sincere? The message of Mr. Rogers is that we all once were emotionally honest when children and only later forgot this birthright. It remains in all of us, and that core of our humanity is what he sought to touch upon; indeed, many people responded to this and felt genuinely touched. The many testimonies of ordinary people to Mr. Rogers’ legacy are inspiring.

This worldview of authenticity was made clear in one particular scene in the movie. “Vogel says he believes his dining companion likes ‘people like me … broken people.’ Rogers is having none of it. ‘I don’t think you are broken,’ Rogers begins, speaking slowly and deliberately. ‘I know you are a man of conviction, a person who knows the difference between what is wrong and what is right. Try to remember that your relationship with your father also helped to shape those parts. He helped you become what you are’” (Cathleen Falsani, Meditating On Love and Connection with Mr. Rogers and C.S. Lewis). That dialogue was not pulled from real life, according to Tom Junod in his latest piece My Friend Mister Rogers, but even Junod found himself emotionally moved when watching the scene. The point is that what mattered to Fred Rogers was conviction, and he lived his life through his own conviction, maybe even a moral obligation. The man was exacting in his discipline and extremely intentional in everything he did, maybe even obsessive-compulsive, as seen in how he maintained his weight at exactly 143 lbs throughout his adult life and in how he kept FBI-style files on all of his friends and correspondents. He had so little interest in himself that even his wife of 50 years knew little about his personal experiences and memories, which he rarely talked about. His entire life, his entire being, apparently was focused laser-like on other people.

He was not a normal human. How does someone become like that? One gets the sense that Mr. Rogers in the flesh would have, with humility, downplayed such an inquiry. He let on that he too was merely human, that he worried and struggled like anyone else. The point, as he saw it, was that he was not a saint or a hero. He was just a man who felt deeply and passionately moved to take action. But where did that powerful current of empathy and compassion come from? He probably would have given all credit to God, as his softspoken and often unspoken faith appears to have been unwavering. Like the Blues Brothers, he was a man on a mission from God. He was not lacking in earnestness. And for those of us not so fully earnest, it can seem incomprehensible that such a mortal human could exist: “He was a genius,” Junod wrote, “he had superpowers; he might as well have been a friendly alien, thrown upon the rocks of our planet to help us find our way to the impossible possibility that we are loved” (My Friend Mister Rogers). Yet however easy it would be to idolize him or dismiss him, he continues to speak to the child in all of us. Maybe ‘Mister Rogers’ was not a mystery; maybe we are making it too complicated. We need to step back and, as he so often advised, remember what it was like to be a child.

Fred Rogers was a simple man who spoke simply and that is what made him so radically challenging. “Indeed, what makes measuring Fred’s legacy so difficult is that Fred’s legacy is so clear.” Junod goes on to say, “It isn’t that he is revered but not followed so much as he is revered because he is not followed—because remembering him as a nice man is easier than thinking of him as a demanding one. He spoke most clearly through his example, but our culture consoles itself with the simple fact that he once existed. There is no use asking further questions of him, only of ourselves. We know what Mister Rogers would do, but even now we don’t know what to do with the lessons of Mister Rogers.” He might as well have been talking about Jesus Christ, the divine made flesh. But if there was spiritual truth in Fred Rogers, he taught that it was a spiritual truth in all of us, that we are children of God. Rather than asking what Mister Rogers would do, what will we do in remembering him?

Native Americans Feasted Some But Mostly Fasted

“There are to be found, among us, a few strong men and women — the remnant of a by-gone generation, much healthier than our own — who can eat at random, as the savages do, and yet last on, as here and there a savage does, to very advanced years. But these random-shot eaters are, at most, but exceptions to the general rule, which requires regularity.”
~William Andrus Alcott, 1859

Three Squares: The Invention of the American Meal
by Abigail Carroll, pp. 12-14

Encountering the tribal peoples of North America, European explorers and settlers found themselves forced to question an institution they had long taken for granted: the meal. “[They] have no such thing as set meals breakfast, dinner or supper,” remarked explorer John Smith. Instead of eating at three distinct times every day, natives ate when their stomachs cued them, and instead of consuming carefully apportioned servings, they gleaned a little from the pot here and there. English colonists deplored this unstructured approach. They believed in eating according to rules and patterns—standards that separated them from the animal world. But when it came to structure, colonists were hardly in a position to boast. Though they believed in ordered eating, their meals were rather rough around the edges, lacking the kind of organization and form that typifies the modern meal today. Hardly well defined or clean-cut, colonial eating occasions were messy in more ways than one. Perhaps this partially explains why explorers and colonists were so quick to criticize native eating habits—in doing so, they hid the inconsistencies in their own.

Colonists found Native American eating habits wanting because they judged them by the European standard. For Europeans, a meal combined contrasting components—usually cereals, vegetables, and animal protein. Heat offered an additional desirable contrast. Swedish traveler Peter Kalm noted that many “meals” consumed by the natives of the mid-Atlantic, where he traveled in the mid-eighteenth century, consisted simply of “[maple] sugar and bread.” With only two ingredients and a distinct lack of protein, not to mention heat, this simplistic combination fell short of European criteria; it was more of a snack. Other typical nonmeals included traveling foods such as nocake (pulverized parched cornmeal to which natives added water on the go) and pemmican (a dense concoction of lean meat, fat, and sometimes dried berries). Hunters, warriors, and migrants relied on these foods, designed to be eaten in that particularly un-meal-like way in which John Williams ate his frozen meat on his journey to Québec: as the stomach required it and on the go.

Jerked venison and fat, chewed as one traversed the wilderness, was not most colonists’ idea of a proper meal, and if natives’ lack of sufficient contrasting components and the absence of a formal eating schedule puzzled colonists, even more mystifying was natives’ habit of going without meals, and often without any food at all, for extended periods. Jesuit missionary Christian LeClercq portrayed the Micmac of the Gaspé Peninsula in Canada as a slothful people, preserving and storing only a token winter’s supply: “They are convinced that fifteen to twenty lumps of meat, or of fish dried or cured in the smoke, are more than enough to support them for the space of five to six months.” LeClercq and many others did not realize that if natives went hungry, they did so not from neglect but by choice. Fasting was a subsistence strategy, and Native Americans were proud of it.

Throughout the year, Native Americans prepared for times of dearth by honing their fasting skills. They practiced hunger as a kind of athletic exercise, conditioning their bodies for the hardships of hunting, war, and seasonal shortages. According to artist George Catlin, the Mandan males in what are now the Dakotas “studiously avoided . . . every kind of excess.” An anthropologist among the Iroquois observed that they were “not great eaters” and “seldom gorged themselves.” To discourage gluttony, they even threatened their children with a visit from Sago’dakwus, a mythical monster that would humiliate them if it caught them in the act of overeating.

Native and European approaches to eating came to a head in the vice of gluttony. Many tribal peoples condemned overeating as a spiritual offense and a practice sure to weaken manly resolve and corrupt good character. Europeans also condemned it, largely for religious reasons, but more fundamentally because it represented a loss of control over the animal instincts. In the European worldview, overindulgence was precisely the opposite of civility, and the institution of the meal guarded against gluttony and a slippery descent into savagery. The meal gave order to and set boundaries around the act of eating, boundaries that Europeans felt native practices lacked. As explorers and colonists defended the tradition of the meal, the institution took on new meaning. For them, it became a subject of pride, serving as an emblem of civilization and a badge of European identity.

Europeans viewed Native Americans largely as gluttons. Because whites caught only fleeting glimpses of the complex and continually shifting lives of Native Americans, they were liable to portray the native way of life according to a single cultural snapshot, which, when it came to food, was the posthunt feast. It was well known that natives ate much and frequently during times of abundance. John Smith recorded that when natives returned from the hunt with large quantities of bear, venison, and oil, they would “make way with their provision as quick as possible.” For a short time, he explained, “they have plenty and do not spare eating.” White witnesses popularized the image of just such moments of plenty as typical.

Although Native Americans were hardly gluttons, Europeans, fascinated by the idea of a primitive people with a childlike lack of restraint, embraced the grossly inaccurate stereotype of the overeating Indian. William Wood portrayed the natives of southern New England as gorging themselves “till their bellies stand forth, ready to split with fullness.” A decidedly strange Anglo-American amusement involved watching Native Americans relish a meal. “Why,” asked George Catlin, “[is it] that hundreds of white folks will flock and crowd round a table to see an Indian eat?” With a hint of disappointment, William Wood recorded the appetites of tribes people invited to an English house to dine as “very moderate.” Wood was uncertain whether to interpret this reserve as politeness or timidity, but clearly he and his fellow English spectators had not expected shy and tempered eaters.

One culture’s perception of another often says more about the perceiver than the perceived. Although settlers lambasted natives for gluttony, whites may have been the real gluttons. According to more than one observer, many a native blushed at Europeans’ bottomless stomachs. “The large appetites of white men who visited them were often a matter of surprise to the Indians who entertained them,” wrote a nineteenth-century folklorist among the Iroquois. Early anthropologist Lewis Morgan concluded that natives required only about one-fifth of what white men consumed, and he was skeptical of his own ability to survive on such a paucity of provisions.

Through their criticisms, exaggerations, and stereotypes, colonists distanced themselves from a population whose ways appeared savage and unenlightened, and the organized meal provided a touchstone in this clash of cultures. It became a yardstick by which Europeans measured culture and a weapon by which they defended their definition of it. They had long known what a meal was, but now, by contrast, they knew firsthand what it was not. Encountering the perceived meal-less-ness of the natives brought the colonists’ esteemed tradition into question and gave them an opportunity to confirm their commitment to their conventions. They refused to approve of, let alone adapt to, the loose foodways of Native Americans and instead embraced all the more heartily a structured, meal-centered European approach to eating.