Old Debates Forgotten

Since earlier last year, I’ve done extensive reading, largely but not entirely focused on health. This has particularly concerned diet and nutrition, although it has crossed over into the territory of mental health with neurocognitive issues, addiction, autism, and much else, with my personal concern being that of depression. The point of this post is to consider some of the historical background. Before I get to that, let me explain how my recent interests have developed.

What got me heading in this direction was the documentary The Magic Pill. It’s about the paleo diet. The practical advice was worth the time spent, though other things drew me into the larger arena of low-carb debate. The thing about the paleo diet is that it offers a framework of understanding that includes many scientific fields involving health beyond only diet, and it also explores historical records, anthropological research, and archaeological evidence. The paleo diet community in particular, along with the low-carb diet community in general, is also influenced by the traditional foods approach of Sally Fallon Morell. She is the lady who, more than anyone else, popularized the work of Weston A. Price, an early 20th century dentist who traveled the world and studied traditional populations. I was already familiar with this area from having read Morell’s first book in the late ’90s or early aughts.

New to me were the writings of Gary Taubes and Nina Teicholz, two science journalists who have helped to shift the paradigm in nutritional studies. They accomplished this not only by presenting detailed surveys of the research and other evidence but by further contextualizing the history of powerful figures, institutions, and organizations that shaped the modern industrial diet. I didn’t realize how far back this debate went: writings on fasting for epilepsy are found in ancient texts; recommendations of a low-carb (apparently ketogenic) diet for diabetes appeared in the 1790s; various low-carb and animal-based diets were popularized for weight loss and general health during the 19th century; and the ketogenic diet was studied for epilepsy beginning in the 1920s. Yet few know this history.

Ancel Keys was one of those powerful figures who, in suppressing his critics and silencing debate, effectively advocated for the standard American diet of high carbs, grains, fruits, vegetables, and industrial seed oils. In The Magic Pill, more recent context is given in following the South African trial of Tim Noakes. Other documentaries have covered this kind of material, often with interviews with Gary Taubes and Nina Teicholz. There has been immense drama involved and, in the past, there was also much public disagreement and discussion. Only now is that returning to mainstream awareness in the corporate media, largely because social media has forced it out into the open. But what interests me is how old the debate is and how much more lively it often was in the past.

The post-revolutionary era created a sense of crisis that, by the mid-19th century, was becoming a moral panic. The culture wars were taking shape. The difference back then was that there was much more of a sense of the connection between physical health, mental health, moral health, and societal health. As a broad understanding, health was seen as key and this was informed by the developing scientific consciousness and free speech movement. The hunger for knowledge was hard to suppress, although there were many attempts as the century went on. I tried to give a sense of this period in two massive posts, The Crisis of Identity and The Agricultural Mind. It’s hard to imagine what that must’ve been like. That scientific and public debate was largely shut down around the World War era, as the oppressive Cold War era took over. Why?

It is strange. The work of Taubes and Teicholz hints at what changed, although the original debate was much wider than diet and nutrition. The info I’ve found about the past has largely come from scholarship in other fields, such as historical and literary studies. Those older lines of thought are mostly treated as historical curiosities at this point, background info for the analysis of entirely other subjects. As for the majority of scientists, doctors, and nutritionists these days, they are almost entirely ignorant of the ideologies that shaped modern thought about disease and health.

This is seen, as I point out, in how Galen’s ancient Greek theory of humors, as incorporated into Medieval Christianity, appears to be the direct source of the basic arguments for a plant-based diet, specifically in terms of the scapegoating of red meat, saturated fat, and cholesterol. Among what I’ve come across, the one scholarly book that covers this in detail is Food and Faith in Christian Culture edited by Ken Albala and Trudy Eden. Bringing that into present times, Belinda Fettke dug up how so much of contemporary nutritional studies and dietary advice was built on the foundation of 19th-20th century vegan advocacy by the Seventh-day Adventists. I’ve never met anyone adhering to “plant-based” ideology who knows this history. Yet now it is becoming common knowledge in the low-carb world.

On the literary end of things, there is a fascinating work by Bryan Kozlowski, The Jane Austen Diet. I enjoyed reading it, in spite of never having cracked open a book by Jane Austen. Kozlowski, although no scholar, was able to dredge up much of interest about those post-revolutionary decades in British society. For one, he shows how obesity was becoming noticeable all the way back then and many were aware of the benefits of low-carb diets. He also makes clear that the ability to maintain a vegetable garden was a sign of immense wealth, not a means for putting much food on the tables of the poor — this is corroborated by Teicholz’s discussion of how gardening in American society, prior to modern technology and chemicals, was difficult and not dependable. More importantly, Kozlowski’s book explains what ‘sensibility’ meant back then, related to ‘nerves’ and ‘vapors’ and later on given the more scientific-sounding label of ‘neurasthenia’.

I came across another literary example of historical exegesis about health and diet, Sander L. Gilman’s Franz Kafka, the Jewish Patient. Kafka was an interesting case, as a lifelong hypochondriac who, it turns out, had good reason to be. He felt that he had inherited a weak constitution and blamed his psychological troubles on it, but more likely causes were urbanization, industrialization, and a vegetarian diet that probably also was a high-carb diet based on nutrient-depleted processed foods; and this was before the time when industrial foods were fortified and many nutritional supplements were available.

What was most educational about the text, though, was Gilman’s historical details on tuberculosis in European thought, specifically in relationship to Jews. To some extent, Kafka had internalized racial ideology, and that is unsurprising. Eugenics was in the air and racial ideology penetrated everything, especially health in terms of racial hygiene. Even for those who weren’t eugenicists, all debate of that era was marked by the expected biases and limitations. Some theorizing was better than others and certainly not all of it was racist, but the entire debate was perhaps tainted by the events that would follow. With the defeat of the Nazis, eugenics fell out of favor for obvious reasons and an entire era of debate was silenced, even many of the arguments that were opposed to or separate from eugenics. Then historical amnesia set in, as many people wanted to forget the past and instead focus on the future. That was unfortunate. The past doesn’t simply disappear but continues to haunt us.

That earlier debate was a struggle between explanations and narratives. With modernity fully taking hold, people wanted to understand what was happening to humanity and where it was heading. It was a time of contrasts, which made the consequences of modernity quite stark. There were plenty of communities that were still pre-industrial, rural, and traditional, but since then most of these communities have died away. The diseases of civilization, at this point, have become increasingly normalized as living memory of anything else has disappeared. It’s not that the desire for ideological explanations has disappeared. What happened was that, with victory in WWII, a particular grand narrative came to dominate the entire Western world and there simply were no other grand narratives to compete with it. Much of the pre-war debate and even scientific knowledge, especially in Europe, was forgotten as the records of it were destroyed, weren’t translated, or lost perceived relevance.

Nonetheless, all of those old ideological conflicts were left unresolved. The concerns then are still concerns now. So many problems worried about back then are getting worse. The connections between various aspects of health have regained their old sense of urgency. The public is once again challenging authorities, questioning received truths, and seeking new meaning. The debate never ended and here we are again, and one could add that fascism also is back rearing its ugly head. It’s worrisome that the political left seems to be slow on the uptake. There are reactionary right-wingers like Jordan Peterson who are offering visions of meaning and who have also become significant figures in the dietary world, by way of the carnivore diet he and his daughter are on. Then there are the conspiratorial paleo-libertarians such as Tristan Haggard, another carnivore advocate.

This is far from being limited to carnivory and the low-carb community includes those across the political spectrum, but it seems to be the right-wingers who are speaking the loudest. The left-wingers who are speaking out on diet come from the confluence of veganism/vegetarianism and environmentalism, as seen with EAT-Lancet (Dietary Dictocrats of EAT-Lancet). The problem with this, besides much of this narrative being false (Carnivore is Vegan), is that it is disconnected from the past. The right-wing is speaking more to the past than is the left-wing, such as Trump’s ability to invoke and combine the Populist and Progressive rhetoric from earlier last century. The political left is struggling to keep up and is being led down ideological dead-ends.

If we want to understand our situation now, we had better carefully study what was happening in centuries past. We are having the same old debates without realizing it and we very well might see them lead to the same kinds of unhappy results.

Local Newspapers Were The Original Social Media

Local newspapers were the original social media. It reminds me of how, in early America, even personal letters would get published in the newspaper and sometimes without consent. A letter coming from a faraway friend or family member might mean news for the whole community. Or else it would make for great scandalous material your opponent might get a hold of (read about America’s founding era).

Privacy wasn’t always highly prized in centuries past. What was going on in your life was everyone’s business, and so everyone had a right to know what you’d been doing, as you had a right to know what everyone else had been doing. Apparently, you were wise to write your letters as if anyone might read them. That is still wise advice in writing anything today, something we’re regularly reminded of when some Tweet comes back to haunt someone as news.

More generally, newspapers were where people looked to learn about anything and everything, as there were few other sources of information. A daily newspaper told you what was going on in your little world and indeed the focus was almost entirely local. Whatever was even mildly significant would get reported. Look at that old newspaper — they really packed in the articles with small print and few pictures.

This still can be seen in some small communities. When my family and I were traveling out West, we passed through an isolated Indian Reservation, probably with a small population. There was a correspondingly small local newspaper. All the articles were about such things as a teen winning an award at school, the public library having purchased some new books, the ladies knitting club planning a bake sale for next Wednesday, etc.

This kind of news is only newsworthy because, in a tight-knit community, everyone is familiar with everyone else. These people are your neighbors and coworkers, friends and family. They go to the same church you do. Their kids go to school with your kids. You see them at the post office, bank, and store. It’s common knowledge about what goes on at Mrs. Jeffries’ card club, who attends, and the kind of person Mrs. Jeffries is. It’s part of a web of local information, what might be called gossip.

Now we have social media for that purpose where you keep close tabs on those you personally know. I might have little sense of what is going on in the lives of my brothers and their families if not for their Facebook postings, despite all of us living close to one another. It could be amusing to publish a monthly newspaper for reporting of family news where all the articles are based on the details gathered from social media, although I think there would only be one edition of the publication before everyone blocked me.

* * *

JULY 10, 1944 – ROBOT BOMBS, NAZIS, SEXISM AND HOME RENTAL AT $50 PER MONTH
by Johnny Joo

[image: 1940s Painesville, Ohio newspaper]

One thing I found interesting was that stories were published about such mundane things, such as – “Mrs. Jeffries Is Hostess To Club” where it goes on to tell about Mrs. Ralph Jeffries and her card club, which she had hosted at her house on a Wednesday night. A following article talks about a family hosting a Sunday dinner at their home.



The paper also throws out a whole bunch of personal information about people:

“Miss Suzanne Miller of Cleveland has returned to her home after spending two weeks at the home of her grandmother, Mrs. H. G. Early, and her cousin Alice Young, of 111 E. Jackson St.”

“Mr. and Mrs. George Yager are now residing at their newly furnished apartment at 236 Courtland St.”

and many more to go along with those ^

If things like this were shared today, people would be throwing a fit (never mind that people share their entire lives on social media).

Enchantment of Capitalist Religion

We Have Never Been Disenchanted
by Eugene McCarraher, excerpt

“The world does not need to be re-enchanted, because it was never disenchanted in the first place.  Attending primarily to the history of the United States, I hope to demonstrate that capitalism has been, as Benjamin perceived, a religion of modernity, one that addresses the same hopes and anxieties formerly entrusted to traditional religion.  But this does not mean only that capitalism has been and continues to be “beguiling” or “fetishized,” and that rigorous analysis will expose the phantoms as the projections they really are.  These enchantments draw their power, not simply from our capacity for delusion, but from our deepest and truest desires — desires that are consonant and tragically out of touch with the dearest freshness of the universe.  The world can never be disenchanted, not because our emotional or political or cultural needs compel us to find enchantments — though they do — but because the world itself, as Hopkins realized, is charged with the grandeur of God…

“However significant theology is for this book, I have relied on a sizable body of historical literature on the symbolic universe of capitalism.  Much of this work suggests that capitalist cultural authority cannot be fully understood without regard to the psychic, moral, and spiritual longings inscribed in the imagery of business culture.”

Has Capitalism Become Our Religion?

“As a Christian, I reject the two assumptions found in conventional economics: scarcity (to the contrary, God has created a world of abundance) and rational, self-seeking, utility-maximizing humanism (a competitive conception of human nature that I believe traduces our creation in the image and likeness of God). I think that one of the most important intellectual missions of our time is the construction of an economics with very different assumptions about the nature of humanity and the world.”


To Be Fat And Have Bread

The obsession with body fat is an interesting story. It didn’t begin a few generations ago but goes back centuries. But maybe that shouldn’t be surprising.

It began in the colonial era, when the diet was transformed by the imperial trade of foreign foods. I might note that this included previously rare or never before seen varieties of fattening carbohydrates: sugar, potatoes, corn, rice, etc. The old feudal system was ending and entirely different forms of food production and diets were developing, especially for the then landless peasants. Hunting, gathering, and grazing for the commoners definitely would have been on the decline for a while at that point, as the last of the commons had been privatized. The loss of access to wild game would take longer in the colonies, but eventually it happened everywhere.

The last stage of that shift overlapped with the beginnings of industrialization and agricultural improvements. In the 19th century, wheat surpluses changed, and hence so did costs and prices. Agriculture boomed even as fewer people were employed in it. There was also a sudden obsession with gender roles and social roles in general, such as the post-revolutionary expectation that the mother make citizens out of her children. Bread-making, a once uncommon activity for Americans, became increasingly important to the normative identity of family life and the symbolic maintenance of the social order.

Regular consumption of wheat bread was once limited to the wealthy, and that is how refined bread gained its moral association with the refined class. Only the wealthy could afford wheat prior to the 19th century; the poor were forced to rely upon cheaper grains and grain substitutes, at a time when bread was regularly adulterated with bark, sawdust, chalk, etc. Poverty breads, in the previous centuries, often were made with no grain at all.* For wheat, and especially heavily refined white bread, to become available to all walks of life meant an upsurge of the civilizing process. The obsession with middle class life took hold and so cookbooks were produced in large numbers.

In a growing reactionary impulse, there was a nostalgic tendency toward invented traditions. Bread took on new meanings that were then projected onto the past. It went unacknowledged how radical were the industrial agriculture and industrial milling that made all of this possible. The disconnection is demonstrated by the simultaneous promotion of the grain production of this industrial age and the complaint about how industrialized life was destroying all that was good. Bread, as a symbol, transcended these mere details.

With the aristocracy having been challenged during the Revolutionary Era, the refinement of the refined class that once was admired became suspect. The ideology of whole foods began to emerge and had some strong proponents. But by the end of the 1800s, the ideal of refinement gained prominence again and prepared the way for the following century of ever greater industrialization of processed foods. Refinement represented progress. Only after more extensive refinement led to mass malnourishment, near the end of that century and heading into the next, did whole foods once again capture the public imagination.

Then we enter the true era of fat obsession, fat blaming, and dieting, endless dieting. Eat your whole grains, get your fiber, make sure you get enough servings of fruits and veggies, and don’t forget to exercise. Calories in, calories out. Count your calories, count your carbs, count your steps. Count every last one of them. Still, the basic sides of the debate remain the same: fewer carbohydrates vs. less meat, whole foods vs. refined foods, barbaric lifestyle vs. civilizing process, individual moral failure vs. societal changes, etc. One theme that runs through dietary advice from the ancient world to the present is that there is a close link between physical health, mental health, and moral health — the latter erupting as moral panic and moral hygiene. But what stands out about the modern era, beginning in the 1600s, is that it was observed that psychological problems were mostly seen among the well-to-do.

This was often blamed on luxury and sometimes on meat (a complaint often about animals raised unnaturally in confinement and probably fed grain, the early equivalent of concerns about factory farming; but also a complaint about the introduction of foreign spices and use of fancy sauces to make meat more appetizing), although there was beginning to be an awareness that a high-carb diet might be playing a role, in that it was often noted that the morbidly obese ate lots of pastries, fruit pies, and such. The poor didn’t have much access to wheat and sugar before the 1800s, but the wealthy had plenty of such foods centuries earlier. Meat consumption didn’t change much during that era of colonial trade. What did change the most was the availability of starchy and sugary foods, and the wealthy consumed them in great proportions. Meat had always been a desirable food going back to earliest hominid evolution. Modern agriculture and global trade, however, entirely transformed the human diet with the introduction of massive amounts of carbohydrates.

It’s strange that right from the beginning of the modern era there were those pushing for a vegetarian diet; not many, but their voices were being heard for the first time. Or maybe it wasn’t so strange. Prior to the modern era, a vegetarian diet so far north in Europe would have been impossible. It was only the elite promoting vegetarianism, as only they could afford a vegetarian diet year round, buying expensive plant-based foods that were often shipped in from far away. But during the Middle Ages and earlier, vegetarianism for the most part was not an option for anyone, since the food items required of such a diet simply weren’t available enough to sustain life, certainly not in places like England or Germany.

There is another side to this that brings us back to the obsession with fat. It was only with the gradual increase of grain production that cattle could be fed grain, not only as additional feed in the winter but year round. This is also what allowed the possibility of confining animals, rather than grazing them on fields. Grain surpluses weren’t consistent until the 19th century, but even before that grain production had been increasing. There were slow improvements in agriculture over the centuries. The rich could afford meat from grain-fed animals much earlier than the rest of the population and it was highly sought after. That is because such meat is extremely fatty, creating those beautiful marbled steaks, pork chops, etc. (such fattiness, by the way, is a sign of metabolic syndrome in both animals and humans). Fat couldn’t have been a focus of debate before grain-fattened animals became common.

So, there is a reason that both wheat bread and fatty meat gained immense symbolic potency at the same time. Similarly, it was during this same era that vegetables became more common and gardens likewise became symbols of wealth, abundance, and the good life. Only the rich could afford to maintain large gardens because of the difficulty involved and immense time-consuming work required (see The Jane Austen Diet by Bryan Kozlowski**; also about the American diet before the 20th century, see The Big Fat Surprise by Nina Teicholz that I quote in Malnourished Americans). They represented the changed diet of modern civilization. They were either indicators of progress or decline, depending on one’s perspective. Prior to modernity, a diet had consisted to a much greater degree of foods that were gathered, hunted, trapped, and fished.

The shift from one source of food to another changed the diet and so changed the debate about diet. There suddenly were more options of foods available as choices to argue about. Diet as a concept was being more fully formulated. Rather than being something inherited according to the traditional constraints of local food systems and food customs, assuming one had the wealth, one could pick from a variety of possible diets. Even to this day, the obsession about dieting carries a taint of class privilege. It is, as they say, a first world problem. But what is fascinating is how this way of thinking took hold in the 1600s and 1700s. There was a modern revolution in dietary thought in the generations before modern political revolution. The old order was falling apart and sometimes actively being dismantled. This created much anxiety and it forced the individual into a state of uncertainty. Old wisdom no longer could be relied upon.

* * *

*Rather than bread, the food that was most associated with the laboring class was fish, a food the wealthy avoided. Think about how lobster and clams used to be poverty foods. In the Galenic theory of humoral physiology, fish is considered cold and wet, hard to digest and weakening. This same humoral category of food also included fruits and vegetables. This might be why, even to this day, many vegetarians and vegans will make an exception for fish, in seeing it as different than ‘meat’. This is an old ideological bias, because ‘meat’ was believed to have the complete opposite effect of being hot and dry, easy to digest and invigorating. This is the reason why meat but not fish was often banned during religious fasts and festivals.

As an interesting side note, the supposed cooling effect of fish was a reason for not eating it during the cold times of the year. Fish is one of the highest sources of vitamin A. Another source is by way of its precursor, beta-carotene, found in vegetables. That these two types of food are considered of the same variety according to Galenic thought is interesting. Cold weather is one of the factors that can disrupt the body’s ability to convert beta-carotene into usable vitamin A. The idea of humors gets this slightly mixed up, but it maybe points to something important having been grasped. Eating more meat, rather than vegetables, in winter is a wise practice in a traditional society that can’t supplement such nutrients. Vitamin A is key for maintaining a strong immune system and handling stress (True Vitamin A For Health And Happiness).

By the way, it was during the 19th century that a discussion finally arose about vegetarianism. The question was about whether life and health could be sustained with vegetables. Then again, those involved were probably still being influenced by Galenic thought. By vegetarianism, they likely meant a more general plant-based diet that excluded ‘meat’ but not necessarily fish. The context of the debate was the religious abstinence of Lent, during which fish was allowed. So, maybe the fundamental argument was more about the possibility of long-term survival solely on moist, cooling foods. Whatever the exact point of contention, it was the first time in the modern Western world where a plant-based diet (be it vegan, vegetarian, or pescetarian-style Mediterranean diet) was considered seriously.

These ideas have been inherited by us, even though the philosophical justifications no longer make sense to us. This is seen in the debate that continues over red meat in particular and meat in general, specifically in terms of the originally Galenic assertion of its heat and dryness building up the ‘blood’ (High vs Low Protein). It’s funny that dietary debates remain obsessed over red meat (along with the related issue of cows and their farts), even though actual consumption of red meat has declined over the past century. As with bread, the symbolic value of red meat has maybe even gained greater importance. Similarly, as I mentioned above, the categorization of fish remains hazy. I know a vegan who doesn’t eat ‘meat’ but does eat fish. When I noted how odd that was, a vegetarian I was talking to thought it made perfect sense. This is Galenic thought without the Galenic theory that at least made it a rational position, but the ideological bias remains, in spite of those adhering to it being unable to explain why they hold that bias. It amuses me.

Ideologies are powerful systems. They are mind viruses that can survive and mutate across centuries and sometimes millennia. Most of the time, their origins are lost to history. But sometimes we are able to trace them and it makes for strange material to study.

See: “Fish in Renaissance Dietary Theory” by Ken Albala from Fish: Food from the Waters ed. by Harlan Walker, and Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden. Also, read text below, such as the discussion of vegetarianism.

* * *

(Both texts below are from collections that are freely available on Google Books and possibly elsewhere.)

The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cooking 2002
ed. by Harlan Walker
“The Apparition of Fat in Western Nutritional Theory”
by Ken Albala

Naturally dietary systems of the past had different goals in mind when framing their recommendations. They had different conceptions of the good, and at some point in history that came to include not being fat. Body size then became an official concern for dietary writers. Whether the original impetus for this change was a matter of fashion, spirituality or has its roots in a different approach to science is impossible to say with any degree of precision. But this paper will argue that nutritional science itself as reformulated in the 17th century was largely to blame for the introduction of fat into the discourse about how health should be defined. […] Obesity is a pathological state according to modern nutritional science. But it was not always so.

When and why fat became a medical issue has been a topic of concern among contemporary scholars. Some studies, such as Peter N. Stearns’ Fat History: Bodies and Beauty in the Modern West, place the origin of our modern obsession in the late 19th century when the rise of nutritional science and health movements led by figures like John Harvey Kellogg, hand in hand with modern advertising and Gibson Girls, swept away the Victorian preference for fulsome figures. As a form of social protest, those who could afford to, much as in the 60s, idealized the slim androgynous figure we associate with flappers. Others push the origin further back into the early 19th century, in the age of Muscular Christianity and Sylvester Graham. But clearly the obsession is earlier than this. In the 18th century the 448 pound physician George Cheyne and his miracle dieting had people flocking to try out the latest ‘cures.’ It was at the same time that dissertations on the topic of obesity became popular, and clearly the medical profession had classified this as a treatable condition. And readers had already been trained to monitor and police their own bodies for signs of impending corpulence. The roots of this fear and guilt must lie somewhere in the previous century as nutritional science was still groping its way through a myriad of chemical and mechanical theories attempting to quantify health and nutrition with empirical research.

The 17th century is also the ideal place to look if only because the earlier system of humoral physiology is almost totally devoid of a concept of fat as a sickness. […]

For all authors in the Galenic tradition it appears that fat was seen as a natural consequence of a complexion tending to the cold and moist, something which could be corrected, but not considered an illness that demanded serious attention. And socially there does not seem to have been any specific stigma attached to fat if Rubens’ taste in flesh is any measure.

The issue of fat really only emerges among authors who have abandoned, in part or totally, the system of humoral physiology. This seems to have something to do with both the new attempts to quantify nutrition, first and most famously by Santorio Santorio9 and also among those who began to see digestion and nutrition as chemical reactions which when gone awry cast fatty deposits throughout the body. It was only then that fat came to be considered a kind of sickness to be treated with therapy.10

The earliest indications that fat was beginning to be seen as a medical problem are found in the work of the first dietary writer who systematically weighed himself. Although Santorio does not seem to have been anxious about being overweight himself, he did consistently define health as the maintenance of body weight. Expanding on the rather vague concept of insensible perspiration used by Galenic authors, Santorio sought to precisely measure the amount of food he consumed each day compared to the amount excreted in ‘sensible’ evacuations. […] Still, fat was not a matter of eating too much. ‘He who eats more than he can digest, is nourished less than he ought to be, and [becomes] consequently emaciated.’12 More importantly, fat was a sign of a system in disarray. […]

Food was not in fact the only factor Santorio or his followers took into account though. As before, the amount of exercise one gets, baths, air quality, even emotions could alter the metabolic rate. But now, the effect of all these could be precisely calculated. […]

At the same time that these mechanistic conceptions of nutrition became mainstream, a chemical understanding of how food is broken down by means of acids and alkalis also came to be accepted by the medical profession. These ideas ultimately harked back to Paracelsus writing in the 16th century but were elaborated upon by 17th century writers […] It is clear that by the early 18th century fat could be seen as a physiological defect that could be corrected by heating the body to facilitate digestive fermentation and the passage of insensible perspiration. […] Although the theories themselves are obviously nothing like our own, we are much closer to the idea of fat as a medical condition. […]

Where Cheyne departs from conventional medical opinion is in his recommendation of a cooked vegetable diet to counter the effects of a disordered system, which he admits is rooted in his own ‘experience and observation on my own crazy carcase and the infirmities of others I have treated’ rather than on any theoretical foundation.

The controversy over whether vegetables could be considered a proper diet, not only for the sick or overgrown but for healthy individuals, was of great concern in the 18th century. Nicholas Andry in his Traité des alimens de caresme offered an extended diatribe against the very notion that vegetables could sustain life, a question of particular importance in Catholic France where Lenten restrictions were still in force, at least officially. […] According to current medical theory, vegetables could not be suitable for weight loss, despite the successful results of the empirics. […]

It is clear that authors had a number of potentially conflicting theoretical models to draw from and both mechanical and chemical explanations could be used to explain why fat accumulates in the body. Yet with entirely different conceptual tools, these authors arrived at dietary goals surprisingly like our own, and equally as contentious. The ultimate goals now became avoiding disease and fat, and living a long life. While it would be difficult to prove that these dietary authors had any major impact beyond the wealthy elites and professionals who read their works, it is clear that a concern over fat was firmly in place by the mid 18th century, and appears to have its roots in a new conception of physiology which not only paid close attention to body weight as an index of health, but increasingly saw fat as a medical condition.

Food and Morality: Proceedings of the Oxford Symposium on Food and Cookery 2007
ed. by Susan R. Friedland
“Moral Fiber: Bread in Nineteenth-Century America”

by Mark McWilliams

From Sarah Josepha Hale, who claimed, ‘the more perfect the bread, the more perfect the lady’ to Sylvester Graham, who insisted, ‘the wife, the mother only’ has the ‘moral sensibility’ required to bake good bread for her family, bread often became a gendered moral marker in nineteenth-century American culture.1 Of course, what Hale and Graham considered ‘good’ bread differed dramatically, and exactly what constituted ‘good’ bread was much contested. Amidst technological change that made white flour more widely available and home cooking more predictable, bread, described in increasingly explicit moral terms, became the leading symbol of a housewife’s care for her family.

Americans were hardly the first to ascribe moral meaning to their daily bread. As Bernard Dupaigne writes, ‘since time immemorial [bread] has attended the great events of various human communities: monsoon or grape harvest bread, the blessed bread of Catholics or the unleavened bread of Passover, or the fasting-break bread of Ramadan. There is no bread that does not, somewhere in the world, celebrate an agricultural or religious holiday, enrich a family event, or commemorate the dead.’2 With such varied symbolic resonance, bread seems easily filled with new meanings.

In America (as later in France),3 bread became a revolutionary symbol. To the early English colonists’ dismay, European wheat did not adapt well to the North American climate; the shift to corn as the primary grain was perhaps the most important dietary adaptation made by the colonists. Wheat remained too expensive for common consumption well into the nineteenth century. […]

By the end of the Revolution, then, bread was already charged with moral meaning in the young United States. In the nineteenth century, this meaning shifted in response to agricultural improvements that made wheat more widely available, technological change that made bread easier to make consistently, and, perhaps most important, social change that made good bread the primary symbol of a housewife’s care for her family. In effect, bread suffered a kind of identity crisis that paralleled the national identity crisis of Jacksonian America. As Americans thought seriously about who they were in this new nation, about how they should act and even how they should eat, bread’s symbolic meaning – and bread itself – changed.

American agricultural production exploded, although the proportion of the population working on farms declined. James Trager notes that even before the McCormick reaper first sold in large numbers as farmers struggled to replace workers leaving for the 1849 Gold Rush, the average time required to produce a bushel of wheat declined 22 per cent from 1831 to 1840.7 Dramatic improvements in efficiency led to larger yields; for example, wheat production more than doubled between 1840 and 1860. Such increases in wheat production, combined with better milling procedures, made white flour finally available in quantities sufficient for white bread to become more than a luxury good.8

Even as wheat became easier to find for many Americans, bread remained notoriously difficult to make, or at least to make well. Lydia Maria Child, a baker’s daughter who became one of America’s leading writers, emphasizes what must have been the intensely frustrating difficulty of learning to cook in the era before predictable heat sources, standardized measurements, and consistent ingredients.9 […]

Unlike Hale, who implies that learning to bake better can be a kind of self-improvement, this passage works more as a dire warning to those not yet making the proper daily bread. Though bread becomes the main distinction between the civilized and the savage, Beecher turns quickly, and reassuringly, to the science of her day: ‘By lightness is meant simply that in order to facilitate digestion the particles are to be separated from each other by little holes or air-cells; and all the different methods of making light bread are neither more nor less than the formation of bread with these air cells’ (170). She then carefully describes how to produce the desired lightness in bread, instructions which must have been welcome to the young housewife now fully convinced of her bread’s moral importance.

The path for Beecher, Hale, and others had been prepared by Sylvester Graham, although he is little mentioned in their work.14 In his campaign to improve bread, Graham’s rhetoric ‘romanticized the life of the traditional household’ in ways that ‘unknowingly helped prepare women to find a new role as guardians of domestic virtue,’ as Stephen Nissenbaum notes.15 Bread was only one aspect of Graham’s program to educate Americans on what he called ‘the Science of Human Life.’ Believing on the one hand, unlike many at the time, that overstimulation caused debility and, on the other, that industrialization and commercialization were debasing modern life, Graham proposed a lifestyle based around strict controls on diet and sexuality.16 While Graham promoted a range of activities from vegetarianism to temperance, his emphasis on good bread was most influential. […]

And yet modern conditions make such bread difficult to produce. Each stage of the process is corrupted, according to Graham. Rather than grow wheat in ‘a pure virgin soil’ required for the best grain, farmers employ fields ‘exhausted by tillage, and debauched by the means which man uses to enrich and stimulate it.’ As Nissenbaum notes, the ‘conscious sexual connotations’ of Graham’s language here is typical of his larger system, but the language also begins to point to the moral dimensions of good bread (6).

Similarly loaded language marks Graham’s condemnation of bakery bread. Graham echoed the common complaints about adulteration by commercial bakers. But he added a unique twist: even the best bakery bread was doubly flawed. The flour itself was inferior because it was over-processed, according to Graham: the ‘superfine flour’ required for white bread ‘is always far less wholesome, in any and every situation of life, than that which is made of wheaten meal which contains all the natural properties of the grain.’ […]

As Nissenbaum argues, pointing to this passage, Graham’s claims invoke ‘the vision of a domestic idyll, of a mother nursing her family with bread and affection’ (8). Such a vision clearly anticipates the emphasis on cookery as measure of a woman’s social worth in the domestic rhetoric that came so to characterize the mid-nineteenth century.

Such language increasingly linking cookery with morality emphasized the virtue not of the food itself but rather of the cooks preparing it. This linkage reached readers not only through the explosion of cookbooks and domestic manuals but also through the growing numbers of sentimental novels. Indeed, this linkage provided a tremendously useful trope for authors seeking a shorthand to define their fictional characters. And that trope, in turn, helped expand the popularity of interpreting cookery in moral terms. […]

After the Civil War, domestic rhetoric evolved away from its roots in the wholesome foods of the nation’s past toward the ever-more refined cuisine of the Gilded Age. Graham’s refusal to evolve in this direction – his system was based entirely in a nostalgic struggle against modernity, against refinement – may well be a large part of why his work was quickly left behind even by those for whom it had paved the way.

* * *

Here is another text I came across. It’s not free, but it seems like a good survey worth buying.


American Heart Association’s “Fat and Cholesterol Counter” (1991)

  • 1963 – “Every woman knows that carbohydrates are fattening, this is a piece of common knowledge, which few nutritionists would dispute.”
  • 1994 – “… obesity may be regarded as a carbohydrate-deficiency syndrome and that an increase in dietary carbohydrate content at the expense of fat is the appropriate dietary part of a therapeutical strategy.”*

My mother was about to throw out an old booklet from the American Heart Association (AHA), “Fat and Cholesterol Counter”, one of several publications they put out around that time. It was published in 1991, the year I started high school. Unsurprisingly, it blames everything on sodium, calories, cholesterol, and, of course, saturated fat.

Even hydrogenated fat gets blamed on saturated fat, since the hydrogenation process turns some small portion of the oil saturated. This ignores the heavy damage and inflammatory response caused by oxidation (both in industrial processing and in cooking), not to mention that hydrogenated industrial seed oils are filled with omega-6 fatty acids, the main reason they are so inflammatory. Saturated fat, on the other hand, is not inflammatory at all. This obsession with saturated fat is strange; it never made any sense from a scientific perspective. By the time the obesity epidemic began, and all that went with it, the consumption of saturated fat by Americans had been steadily dropping for decades, ever since the invention of industrial seed oils in the late 1800s and the fear of meat stirred up by Upton Sinclair’s muckraking novel about the meatpacking industry, The Jungle.

The amount of saturated fat and red meat in the American diet has declined over the past century, replaced by those industrial seed oils and lean white meat, along with fruits and vegetables — all of which have been increasing.** Chicken, in particular, replaced beef, and what stands out about chicken is that, like those industrial seed oils, it is high in the inflammatory omega-6 fatty acids. How could saturated fat be causing the greater rates of heart disease and such when people were eating less of it? This scapegoating wasn’t only unscientific but blatantly irrational. All of this was known way back when Ancel Keys went on his anti-fat crusade (The Creed of Ancel Keys). It wasn’t a secret. And explaining it away required cherry-picked data and convoluted rationalizations.

Worse than removing saturated fat when it’s not a health risk is the fact that it is actually an essential nutrient for health: “How much total saturated do we need? During the 1970s, researchers from Canada found that animals fed rapeseed oil and canola oil developed heart lesions. This problem was corrected when they added saturated fat to the animals diets. On the basis of this and other research, they ultimately determined that the diet should contain at least 25 percent of fat as saturated fat. Among the food fats that they tested, the one found to have the best proportion of saturated fat was lard, the very fat we are told to avoid under all circumstances!” (Millie Barnes, The Importance of Saturated Fats for Biological Functions).

It is specifically lard that has been most removed from the diet, and this is significant as lard was central to the American diet until the past century: “Pre-1936 shortening is comprised mainly of lard while afterward, partially hydrogenated oils came to be the major ingredient” (Nina Teicholz, The Big Fat Surprise, p. 95); “Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard” (p. 126). And what about the Mediterranean people who supposedly are so healthy because of their love of olive oil? “Indeed, in historical accounts going back to antiquity, the fat more commonly used in cooking in the Mediterranean, among peasants and the elite alike, was lard.” (p. 217).

Jason Prall notes that long-lived populations ate “lots of meat” and specifically, “They all ate pig. I think pork was the only common animal that we saw in the places that we went” (Longevity Diet & Lifestyle Caught On Camera w/ Jason Prall). The famously long-lived Okinawans also eat every part of the pig, such that their entire culture and religion centered around pigs (Blue Zones Dietary Myth). Lard, in case you didn’t know, comes from pigs. Pork and lard are found in so many diets for the simple reason that pigs can live in diverse environments, from mountainous forests to tangled swamps to open fields, and they are a food source available year round.

Another thing that has gone hand in hand with the loss of healthy, nutrient-dense saturated fat in the American diet is a loss of nutrition in general. It’s not only that plant foods have fewer minerals and vitamins because of depleted soil and because they are picked unripe in order to ship them long distances. The same is true of animal foods, since the animals are being fed the same crappy plant foods as us humans. But at the very least, even factory-farmed animals have far more bioavailable nutrient-density than plant foods from industrial agriculture. If we ate more fatty meat, saturated fat or otherwise, we’d be getting far more fat-soluble vitamins. And when looking at all animal foods, in particular from pasture-raised and wild-caught sources, there is no mineral or vitamin that can’t be obtained at required levels. The same can’t be said for plant foods on a vegan diet.

Back in 1991, the AHA was recommending the inclusion of lots of bread, rolls, crackers, and pasta (“made with low-fat milk and fats or oils low in saturated fatty acids” and “without eggs”); rice, beans, and peas; sugary fruits and starchy vegetables (including juices) — and desserts were fine as well. At most, eat 3 or 4 eggs a week and, as expected, optimally avoid the egg yolks where all the nutrition is located (not only fat-soluble vitamins, but also choline and cholesterol and much else; by the way, your brain health is dependent on high levels of dietary cholesterol, such that statins in blocking cholesterol cause neurocognitive decline). As long as there was little if any saturated fat and fat in general was limited, buckets of starchy carbs and sugar were considered by the AHA to be part of a healthy and balanced diet. That is sad.

This interested me because of the year. This was as I was entering young adulthood and so I was becoming more aware of the larger world. I remember the heavy-handed propaganda preaching that fiber is good and fat is evil, as if the war on obesity was a holy crusade that demanded black-and-white thinking, all subtleties and complexities denied in adherence to the moralistic dogma against the sins of gluttony and sloth — it was literally an evangelistic medical gospel (see Belinda Fettke’s research on the Seventh Day Adventists: Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force). In our declining public health, we were a fallen people who required a dietary clergy for our salvation. Millennia of traditional dietary wisdom and knowledge was thrown out the window as if it were worthless or maybe even dangerous.

I do remember my mother buying high-fiber cereals and “whole wheat” commercial breads (not actually whole wheat, as it is simply denatured refined flour with fiber added back in). Along with this, skim or 1% fat dairy foods, especially milk, were included with every major meal and often snacks. I had sugary and starchy cereal with skim milk (and/or milk with sugary Instant Breakfast) every morning and a glass of skim milk with every dinner, maybe sometimes milk for lunch. Cheese was a regular part of the diet as well, such as with pizza eaten multiple times a week or any meal with pasta, and heck, cheese was a great snack all by itself, but also good combined with crackers, and one could pretend to be healthy if one used Triscuits. Those were the days when I might devour a whole block of cheese, probably low-fat, in a single sitting — I was probably craving fat-soluble vitamins. Still, most of my diet was starches and sugar, as that was my addiction. The fiber was an afterthought to market junk food as health food.

It now makes sense. When I was a kid in the 1980s, my mother says the doctor understood that whole fat milk was important for growing bodies. So that is what he recommended. But I guess the anti-fat agenda had fully taken over by the 1990s. The AHA booklet from 1991 was by then recommending “skim or 1% milk and low-fat cheeses” for all ages, including babies and children, pregnant and lactating women. Talk about a recipe for health disaster. No wonder metabolic syndrome exploded and neurocognitive health fell like a train going over a collapsed bridge. It was so predictable, as the failure of this diet was understood by many going back to earlier in the century (e.g., Weston A. Price; see my post Health From Generation To Generation).

The health recommendations did get worse over time, but to be fair the problem started much earlier. Experts had been discouraging breastfeeding for a while. Traditionally, babies were breastfed for the first couple of years or so. By the time modern America came around, experts were suggesting a short period of breast milk or even entirely replacing it with scientifically-designed formulas. My mother only breastfed me for 5-6 months and then put me on cow’s milk — of course, pasteurized and homogenized milk from grain-fed and factory-farmed cows. When the dairy caused diarrhea, the doctor suggested soy milk. After a while, my mother put me on dairy again, but the diarrhea persisted and so for preschool she put me back on soy milk. I was drinking soy milk off and on for years during the most important stage of development. Holy fuck! That had to have done serious damage to my developing body, in particular my brain. Then I went from that to skim milk during another important time of development, as I hit puberty and went through growth spurts.

Early on in elementary school, I had delayed reading and a diagnosis of learning disability, seemingly along with something along the lines of either Asperger’s or specific language impairment, although undiagnosed. I definitely had social and behavioral issues, in that I didn’t understand people well when I was younger. Then entering adulthood, I was diagnosed with depression and something like a “thought disorder” (I forget the exact diagnosis I got while in a psychiatric ward after a suicide attempt). No doubt the latter was already present in my early neurocognitive problems, as I obviously was severely depressed at least as early as 7th grade. A malnourished diet of lots of carbs and little fat was the most probable cause of all these problems.

Thanks, American Heart Association! Thanks for doing so much harm to my health and making my life miserable for decades, not to mention nearly killing me through depression so severe I attempted suicide, followed by decades of depressive struggle. That isn’t even to mention the sugar and carb addiction that plagued me for so long. Now multiply my experience by that of at least hundreds of millions of other Americans, and an even greater number of people elsewhere as their governments followed the example of the United States, across the past few generations. Great job, AHA. And much appreciation for the helping hand of the USDA and various medical institutions in enforcing this anti-scientific dogma.

Let me be clear about one thing. I don’t blame my mother, as she was doing the best she could with the advice given to her by doctors and corporate media, along with the propaganda literature from respected sources such as the AHA. Nor do I blame any other average Americans as individuals, although I won’t hold back on placing the blame squarely on the shoulders of demagogues like Ancel Keys. As Gary Taubes and Nina Teicholz have made so clear, this was an agenda of power, not science. With the help of government and media, the actual scientific debate was silenced and disappeared from public view (Eliminating Dietary Dissent). The consensus in favor of a high-carb, low-fat diet didn’t emerge through rational discourse and evidence-based medicine — it was artificially constructed and enforced.

Have we learned our lesson? Apparently not. We still see this tactic of technocratic authoritarianism, such as the corporate-funded push behind EAT-Lancet (Dietary Dictocrats of EAT-Lancet). Why do we tolerate this agenda-driven exploitation of public trust and harm to public health?

* * *

 * First quote: Passmore, R., and Y. E. Swindells. 1963. “Observations on the Respiratory Quotients and Weight Gain of Man After Eating Large Quantities of Carbohydrates.” British Journal of Nutrition 17: 331-39.
Second quote: Astrup, A., B. Buemann, N. J. Christensen, and S. Toubro. 1994. “Failure to Increase Lipid Oxidation in Response to Increasing Dietary Fat Content in Formerly Obese Women.” American Journal of Physiology 266 (4, pt. 1): E592-99.
Both quotes are from a talk given by Peter Ballerstedt, “AHS17 What if It’s ALL Been a Big Fat Lie?,” available on the Ancestry Foundation Youtube page.

(It appears that evidence-based factual reality literally changes over time. I assume this relativity of ideological realism has something to do with quantum physics. It’s the only possible explanation. I’m feeling a bit snarky, in case you didn’t notice.)

** Americans, in prior centuries, ate few plant foods because they were so difficult and time-consuming to grow. There was no way to control the pests and wild animals that often would devour and destroy a garden or a crop. It was too much investment for too little reward, not to mention extremely unreliable as a food source and thus risky to survival for those with a subsistence lifestyle. Until modern farming methods, especially with the 20th-century industrialization of agriculture, most Americans primarily ate animal foods with tons of fat, mostly butter, cream, and lard, along with a wide variety of wild-caught animal foods.

This is discussed by Nina Teicholz in The Big Fat Surprise: “Early-American settlers were ‘indifferent’ farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with ‘the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,’ as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.” (See more in my post Malnourished Americans.) That puts the conventional dietary debate in an entirely different context. Teicholz adroitly dismantles the claim that fatty animal foods have increased in the American diet.

Teicholz goes on to state: “So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating.” It was the discovery of seed oils that originally were an industrial byproduct, combined with Upton Sinclair’s muckraking journalism about the meatpacking industry (The Jungle), that caused meat and animal fats to quickly fall out as the foundation of the American diet. Saturated fat, in particular, had been in decline for decades prior to the epidemics of obesity, diabetes, and heart disease. Ancel Keys knew this data, which is why he had to throw out some of his data to make it fit his preconceived conclusions in promoting his preferred dietary ideology.

If we honestly wanted to find the real culprit, we would look to the dramatic rise of vegetable oils, white flour, and sugar in the 20th-century diet. It began much earlier with grain surpluses and cheap wheat, especially in England during the 1800s, but in the United States it became most noticeable in the first half of the 20th century. The agenda of Keys and the AHA simply made a bad situation worse, much worse.

Blake in an Age of Paine

“Paine is either a Devil or an Inspired man.”

“…the Holy Ghost who in Paine strives with Christendom as in Christ he strove with the Jews.”

“Is it a greater miracle to feed five thousand men with five loaves than to overthrow all the armies of Europe with a small pamphlet?”

“Christ died an unbeliever and if the Bishops had their way so would Paine.”

Those are quotes of William Blake writing about Thomas Paine. Blake didn’t agree with Paine’s deism. But his writings show he was quite familiar with Paine’s work and saw its influence in a positive light.

Although the story of Blake warning Paine of impending arrest might not be true, they were part of the same social circle. Still, some like to imagine what an encounter between them might have been like — here is the play In Lambeth by Jack Shepherd:

Blake: Prophet Against Empire
by David V. Erdman

Blake’s Margins: An Interpretive Study of the Annotations
by Hazard Adams

Ideology and Utopia in the Poetry of William Blake
by Nicholas M. Williams

“There is a Grain of Sand in Lambeth that Satan cannot find”: William Blake meets Thomas Paine, a dramatisation of the play In Lambeth
by Roger G. Lewis

Deists 1
by Larry Clayton

William Blake, Thomas Paine and the Bible
by Golgonooza

“To Defend the Bible in This Year 1798 Would Cost a Man His Life”
by Morton D. Paley

Flames in the Night Sky : Blake, Paine and the Meeting of the Society of Loyal Britons, Lambeth, October 10th, 1793
by Michael Phillips

Blake’s Jerusalem
by Judy Cox

Blake and Paine: Devils or Inspired Men?
by Humberto Garcia

Blake, Moravianism, and Thomas Paine: Expanding on Anna’s Previous Argument
by Viv Alexandra

Brothers in Pen
by Andy Tang

The Pain of Will
by Daniel Lizaola Lopez

Liberté, égalité, fraternité
by Beyanira Bautista

Religion and Politics
by Israel Alonso

“A Mark was Made”

Here in Iowa City, we’ve been in a permanent state of construction for years now. I can’t remember the last time some part of the downtown wasn’t in disarray while in the process of being worked on. Large parts of the pedestrian mall have been a maze of fencing and torn up brick for years upon years (Michael Shea, Ped Mall updates soon to come). An entire generation of Iowa Citians has grown up with construction as their childhood memory of the town.

For a smaller town with a population of only 75,798, the city government impressively throws millions of dollars at projects like it’s pocket change. The pedestrian mall renovation alone is projected to be $7.4 million and that’s limited to about a block of the downtown. The city has had many similar projects in recent years, including the building of multiple massive parks and a city-wide infrastructure overhaul, to name a few. Over the past decade or so, the city expenditures for these kinds of improvements might add up to hundreds of millions of dollars. That is a lot of money for such a limited area, considering one can take a relaxed stroll from one side of town to the other in a couple of hours or less.

All of this public investment is called progress, so I hear. As part of this project to improve and beautify the downtown, they apparently built a wall as a memorial to very important people (a wall to Make Iowa City Great Again?). It’s entitled “A Mark was Made”. From the official City of Iowa City website, it is reported that, “The wall was created to become an evolving acknowledgement celebrating the leadership, activism, and creativity of those who have influenced the Iowa City community and beyond” (Installation of ‘A Mark was Made’ story wall completed as part of Ped Mall project).

One of the local figures included is John Alberhasky, now deceased. He was a respectable member of the local capitalist elite and still well-remembered by many. For the older generations who are fond of what capitalism once meant, this is the kind of guy they’re thinking of. Apparently, I’m now officially part of the “older generations”, as I can recall what Iowa City used to be like… ah, the good ol’ days.

Mr. Alberhasky was not only a small business owner but also a widely known community leader. The small mom-and-pop grocery store that he started, affectionately known as “Dirty John’s”, has long been a regular stop even for people not living in the neighborhood, and the store’s deli used to make sandwiches that were sold at a local high school. Once among dozens of such corner grocery stores, it is the only one remaining in this town. The store itself is a memorial to a bygone era.

This local businessman seems like a worthy addition to this memorial. He was beloved by the community. And he seems to have established an honorable family business that is being carried on with care by his descendants. There are few families that have been part of the Iowa City community for so long, going back to the 1800s, the kinds of ethnic immigrants that built this country. They are good people, the best landlords I’ve ever had I might add (as a tenant for a couple of decades, does that make me their landpeasant?). I approve of their family’s patriarch being included on this fine wall of public distinction.

Still, I can’t help noting an irony about this memorial to community involvement and public service. It is located in the People’s Park that was turned into the gentrified front yard of a TIF-funded high-rise built for rich people (TIFs, Gentrification, and Plutocracy). The construction effectively evicted the common folk from this public park for years, and a once thriving community space has never been the same since (Freedom and Public Space). Only recently did they finally put seating back to allow the dirty masses to once again rest their weary bodies, but it has yet to regain the welcoming feel it once held as a vibrant expression of community.

To this day, there is no memorial or even a small plaque indicating that this is a unique park, separate from and preceding the pedestrian mall: a green space established through community organizing and public demand, the first public space established downtown. It’s as if the People’s Park does not exist and, as far as public memory goes, never did exist. The people who remember it grow fewer in number.

Not even the local government will officially acknowledge it. In the article about the new wall from the city website, they don’t mention that this is the People’s Park and, instead, refer to it as merely Black Hawk Mini Park. I did a quick site search and the People’s Park is not mentioned by name anywhere on the city website. But at least Chief Black Hawk gets mentioned for his role in surrendering to the US military that allowed white people to take his people’s land… that’s something.

Oil Industry Knew About Coming Climate Crisis Since 1950s

“Even now, man may be unwittingly changing the world’s climate through the waste products of his civilization. Due to our release through factories and automobiles every year of 6 billion tons of carbon dioxide (CO2), which helps air absorb heat from the sun. Our atmosphere seems to be getting warmer.”
~Unchained Goddess, film from Bell Telephone Science Hour (1958)

“[C]urrent scientific opinion overwhelmingly favors attributing atmospheric carbon dioxide increase to fossil fuel combustion.”
~James F. Black, senior scientist in the Products Research Division of Exxon Research and Engineering, from his presentation to Exxon corporate management entitled “The Greenhouse Effect” (July, 1977)

“Data confirm that greenhouse gases are increasing in the atmosphere. Fossil fuels contribute most of the CO2.”
~Duane G. Levine, Exxon scientist, presentation to the Board of Directors of Exxon entitled “Potential Enhanced Greenhouse Effects: Status and Outlook” (February 22, 1989)

“Scientists also agree that atmospheric levels of greenhouse gases (such as C02) are increasing as a result of human activity.”
~Oil industry Global Climate Coalition, internal report entitled “Science and Global Climate Change: What Do We Know? What are the Uncertainties?” (early 1990s)

“The scientific basis for the Greenhouse Effect and the potential impact of human emissions of greenhouse gases such as CO2 on climate is well established and cannot be denied.”
~Oil industry group Global Climate Coalition’s advisory committee of scientific and technical experts, reporting in the internal document “Predicting Future Climate Change: A Primer”, written in 1995, though a redacted and censored version was distributed in 1996 (see UCSUSA’s “Former Exxon Employee Says Company Considered Climate Risks as Early as 1981”)

“Perhaps the most interesting effect concerning carbon in trees which we have thus far observed is a marked and fairly steady increase in the 12C/13C ratio with time. Since 1840 the ratio has clearly increased markedly. This effect can be explained on the basis of a changing carbon dioxide concentration in the atmosphere resulting from industrialization and the consequent burning of large quantities of coal and petroleum.”
~Harrison Brown, a biochemist at the California Institute of Technology, who with colleagues submitted a research proposal to the American Petroleum Institute entitled “The determination of the variations and causes of variations of the isotopic composition of carbon in nature” (1954)

“This report unquestionably will fan emotions, raise fears, and bring demand for action. The substance of the report is that there is still time to save the world’s peoples from the catastrophic consequence of pollution, but time is running out.
“One of the most important predictions of the report is carbon dioxide is being added to the Earth’s atmosphere by the burning of coal, oil, and natural gas at such a rate that by the year 2000, the heat balance will be so modified as possibly to cause marked changes in climate beyond local or even national efforts. The report further states, and I quote “. . . the pollution from internal combustion engines is so serious, and is growing so fast, that an alternative nonpolluting means of powering automobiles, buses, and trucks is likely to become a national necessity.””

~Frank Ikard, then-president of the American Petroleum Institute, addressing industry leaders at the annual meeting, “Meeting the challenges of 1966” (November 8, 1965), given 3 days after the U.S. Science Advisory Committee’s official report, “Restoring the Quality of Our Environment”

“At a 3% per annum growth rate of CO2, a 2.5°C rise brings world economic growth to a halt in about 2025.”
~J. J. Nelson, American Petroleum Institute, notes from CO2 and Climate Task Force (AQ-9) meeting, attended by representatives from Exxon, SOHIO, and Texaco (March 18, 1980)

“Exxon position: Emphasize the uncertainty in scientific conclusions regarding the potential enhanced Greenhouse effect.”
~Joseph M. Carlson, Exxon spokesperson writing in “1988 Exxon Memo on the Greenhouse Effect” (August 3, 1988)

“Victory Will Be Achieved When
• “Average citizens understand (recognise) uncertainties in climate science; recognition of uncertainties becomes part of the ‘conventional wisdom’
• “Media ‘understands’ (recognises) uncertainties in climate science
• “Those promoting the Kyoto treaty on the basis of extant science appear to be out of touch with reality.”
~American Petroleum Institute’s 1998 memo on denialist propaganda, see Climate Science vs. Fossil Fuel Fiction; “The API’s task force was made up of the senior scientists and engineers from Amoco, Mobil, Phillips, Texaco, Shell, Sunoco, Gulf Oil and Standard Oil of California, probably the highest paid and sought-after senior scientists and engineers on the planet. They came from companies that, just like Exxon, ran their own research units and did climate modeling to understand the impact of climate change and how it would impact their company’s bottom line.” (Not Just Exxon: The Entire Oil and Gas Industry Knew The Truth About Climate Change 35 Years Ago.)

[C]urrent scientific opinion overwhelmingly favors attributing atmospheric carbon dioxide increase to fossil fuel combustion. […] In the first place, there is general scientific agreement that the most likely manner in which mankind is influencing the global climate is through carbon dioxide release from the burning of fossil fuels. A doubling of carbon dioxide is estimated to be capable of increasing the average global temperature by from 1 [degree] to 3 [degrees Celsius], with a 10 [degrees Celsius] rise predicted at the poles. More research is needed, however, to establish the validity and significance of predictions with respect to the Greenhouse Effect. Present thinking holds that man has a time window of five to 10 years before the need for hard decisions regarding changes in energy strategies might become critical.
~James F. Black, senior scientist in the Products Research Division of Exxon Research and Engineering, from his presentation to Exxon corporate management entitled “The Greenhouse Effect” (July, 1977)

Present climactic models predict that the present trend of fossil fuel use will lead to dramatic climatic changes within the next 75 years. However, it is not obvious whether these changes would be all bad or all good. The major conclusion from this report is that, should it be deemed necessary to maintain atmospheric CO2 levels to prevent significant climatic changes, dramatic changes in patterns of energy use would be required.
~W. L. Ferrall, Exxon scientist writing in an internal Exxon memo, “Controlling Atmospheric CO2” (October 16, 1979)

In addition to the effects of climate on the globe, there are some particularly dramatic questions that might cause serious global problems. For example, if the Antarctic ice sheet which is anchored on land, should melt, then this could cause a rise in the sea level on the order of 5 meters. Such a rise would cause flooding in much of the US East Coast including the state of Florida and Washington D.C.
~Henry Shaw and P. P. McCall, Exxon scientists writing in an internal Exxon report, “Exxon Research and Engineering Company’s Technological Forecast: CO2 Greenhouse Effect” (December 18, 1980)

“but changes of a magnitude well short of catastrophic…” I think that this statement may be too reassuring. Whereas I can agree with the statement that our best guess is that observable effects in the year 2030 are likely to be “well short of catastrophic”, it is distinctly possible that the CPD scenario will later produce effects which will indeed be catastrophic (at least for a substantial fraction of the earth’s population). This is because the global ecosystem in 2030 might still be in a transient, headed for much more significant effects after time lags perhaps of the order of decades. If this indeed turns out to be the case, it is very likely that we will unambiguously recognize the threat by the year 2000 because of advances in climate modeling and the beginning of real experimental confirmation of the CO2 problem.
~Roger Cohen, director of the Theoretical and Mathematical Sciences Laboratory at Exxon Research writing in inter-office correspondence “Catastrophic effects letter” (August 18, 1981)

In addition to the effects of climate on global agriculture, there are some potentially catastrophic events that must be considered. For example, if the Antarctic ice sheet which is anchored on land should melt, then this could cause a rise in sea level on the order of 5 meters. Such a rise would cause flooding on much of the U.S. East Coast, including the state of Florida and Washington, D.C. […]
The greenhouse effect is not likely to cause substantial climatic changes until the average global temperature rises at least 1 degree Centigrade above today’s levels. This could occur in the second to third quarter of the next century. However, there is concern among some scientific groups that once the effects are measurable, they might not be reversible and little could be done to correct the situation in the short term. Therefore, a number of environmental groups are calling for action now to prevent an undesirable future situation from developing.
Mitigation of the “greenhouse effect” would require major reductions in fossil fuel combustion.
~Marvin B. Glaser, Environmental Affairs Manager, Coordination and Planning Division of Exxon Research and Engineering Company, writing in “Greenhouse Effect: A Technical Review” (April 1, 1982)

In summary, the results of our research are in accord with the scientific consensus on the effect of increased atmospheric CO2 on climate. […]
Furthermore our ethical responsibility is to permit the publication of our research in the scientific literature. Indeed, to do otherwise would be a breach of Exxon’s public position and ethical credo on honesty and integrity.
~Roger W. Cohen, Director of Exxon’s Theoretical and Mathematical Sciences Laboratory, memo “Consensus on CO2 Impacts” to A. M. Natkin of Exxon’s Office of Science and Technology (September 2, 1982)

[F]aith in technologies, markets, and correcting feedback mechanisms is less than satisfying for a situation such as the one you are studying at this year’s Ewing Symposium. […]
Clearly, there is vast opportunity for conflict. For example, it is more than a little disconcerting that the few maps showing the likely effects of global warming seem to reveal the two superpowers losing much of the rainfall, with the rest of the world seemingly benefitting.
~Dr. Edward E. David, Jr., president of the Exxon Research and Engineering Company, keynote address to the Maurice Ewing symposium at the Lamont–Doherty Earth Observatory on the Palisades, New York campus of Columbia University, published in “Inventing the Future: Energy and the CO2 ‘Greenhouse Effect’” (October 26, 1982)

Data confirm that greenhouse gases are increasing in the atmosphere. Fossil fuels contribute most of the CO2. […]
Projections suggest significant climate change with a variety of regional impacts. Sea level rise with generally negative consequences. […]
Arguments that we can’t tolerate delay and must act now can lead to irreversible and costly Draconian steps. […]
To be a responsible participant and part of the solution to [potential enhanced greenhouse], Exxon’s position should recognize and support 2 basic societal needs. First […] to improve understanding of the problem […] not just the science […] but the costs and economics tempered by the sociopolitical realities. That’s going to take years (probably decades).
~Duane G. Levine, Exxon scientist, presentation to the Board of Directors of Exxon entitled “Potential Enhanced Greenhouse Effects: Status and Outlook” (February 22, 1989)

* * *

To see more damning quotes from Exxon insiders, see Wikiquote page on ExxonMobil climate change controversy. Here are other resources:

We Made Climate Change Documentaries for Science Classes Way back in 1958 So Why Do Folks Still Pretend Not to Know?
from O Society

Report: Oil Industry Knew About Dangers of Climate Change in 1954
from Democracy Now! (see O Society version)

CO2’s Role in Global Warming Has Been on the Oil Industry’s Radar Since the 1960s
by Neela Banerjee

Exxon Knew about Climate Change 40 years ago
by Shannon Hall (see O Society version)

Industry Ignored Its Scientists on Climate
by Andrew C. Revkin

Exxon: The Road Not Taken
by Neela Banerjee, Lisa Song, & David Hasemyer

The Climate Deception Dossiers
(and full report)
from Union of Concerned Scientists

Exxon Has Spent $30+ Million on Think Tanks?
from Think Tank Watch

How Fossil Fuel Money Made Climate Change Denial the Word of God
by Brendan O’Connor (see O Society version)

A Timeline of Climate Science and Policy
by Brad Johnson

Autism and the Upper Crust

There are multiple folktales about the tender senses of royalty, aristocrats, and other elites. The best-known example is “The Princess and the Pea”. In the Aarne-Thompson-Uther system of folktale categorization, it is listed as type 704, about the search for a sensitive wife. That isn’t to say that all the narrative variants of elite sensitivity involve potential wives. Anyway, the man who made this particular story famous is Hans Christian Andersen, who published his version in 1835. He longed to be a part of the respectable class, but felt excluded. Some speculate that he projected his own class issues onto his slightly altered version of the folktale, something discussed in the Wikipedia article about the story:

“Wullschlager observes that in “The Princess and the Pea” Andersen blended his childhood memories of a primitive world of violence, death and inexorable fate, with his social climber’s private romance about the serene, secure and cultivated Danish bourgeoisie, which did not quite accept him as one of their own. Researcher Jack Zipes said that Andersen, during his lifetime, “was obliged to act as a dominated subject within the dominant social circles despite his fame and recognition as a writer”; Andersen therefore developed a feared and loved view of the aristocracy. Others have said that Andersen constantly felt as though he did not belong, and longed to be a part of the upper class.[11] The nervousness and humiliations Andersen suffered in the presence of the bourgeoisie were mythologized by the storyteller in the tale of “The Princess and the Pea”, with Andersen himself the morbidly sensitive princess who can feel a pea through 20 mattresses.[12] Maria Tatar notes that, unlike the folk heroine of his source material for the story, Andersen’s princess has no need to resort to deceit to establish her identity; her sensitivity is enough to validate her nobility. For Andersen, she indicates, “true” nobility derived not from an individual’s birth but from their sensitivity. Andersen’s insistence upon sensitivity as the exclusive privilege of nobility challenges modern notions about character and social worth. The princess’s sensitivity, however, may be a metaphor for her depth of feeling and compassion.[1] […] Researcher Jack Zipes notes that the tale is told tongue-in-cheek, with Andersen poking fun at the “curious and ridiculous” measures taken by the nobility to establish the value of bloodlines. He also notes that the author makes a case for sensitivity being the decisive factor in determining royal authenticity and that Andersen “never tired of glorifying the sensitive nature of an elite class of people”.[15]”

Even if that is true, there is more going on here than some guy working out his personal issues through fiction. This princess’s sensory sensitivity sounds like autism spectrum disorder, and I have a theory about that. Autism has been associated with certain foods like wheat, specifically refined flour in highly processed foods (The Agricultural Mind). And a high-carb diet in general causes numerous neurocognitive problems (Ketogenic Diet and Neurocognitive Health), along with other health conditions such as metabolic syndrome (Dietary Dogma: Tested and Failed) and insulin resistance (Coping Mechanisms of Health), atherosclerosis (Ancient Atherosclerosis?) and scurvy (Sailors’ Rations, a High-Carb Diet). By the way, the rates of these diseases have been increasing over the generations, often first appearing among the affluent. Sure, grains have long been part of the diet, but the one grain most associated with the wealthy going back millennia was wheat, as it was harder to grow, which kept it in short supply and expensive. Indeed, it is wheat, not the other grains, that gets brought up in relation to autism. This is largely because of gluten, though other things have been pointed to.

It is relevant that these stories were written down around the historical period when the first large grain surpluses were becoming common, and so bread, white bread most of all, became a greater part of the diet. But this dietary change was first seen among the upper classes. It’s too bad we don’t have cross-generational data on autism rates broken down by demographics and diet, but it is interesting to note that neurasthenia, a 19th-century mental health condition also involving sensitivity, was seen as a disease of the middle-to-upper class (The Crisis of Identity), and this notion of the elite as sensitive was a romanticized ideal going back to the 1700s with what Jane Austen referred to as ‘sensibility’ (see Bryan Kozlowski’s The Jane Austen Diet, as quoted in the link immediately above). In that same historical period, others noted that schizophrenia was spreading along with civilization (e.g., Samuel Gridley Howe and Henry Maudsley; see The Invisible Plague by Edwin Fuller Torrey & Judy Miller), and I’d add that there appear to be some overlapping factors between schizophrenia and autism. Besides gluten, some of the implicated factors are glutamate, exorphins, inflammation, etc. “It is unlikely,” writes William Davis, “that wheat exposure was the initial cause of autism or ADHD but, as with schizophrenia, wheat appears to be associated with worsening characteristics of the conditions” (Wheat Belly, p. 48).

For most of human history, crop failures and famine were a regular occurrence. And this most harshly affected the poor masses when grain and bread prices went up, leading to food riots and sometimes revolutions (e.g., French Revolution). Before the 1800s, grains were so expensive that, in order to make them affordable, breads were often adulterated with fillers or entirely replaced with grain substitutes, the latter referred to as “famine breads” and sometimes made with tree bark. Even when available, the average person might be spending most of their money on bread, as it was one of the most costly foods around and other foods weren’t always easily obtained.

Even so, grain being highly sought after certainly doesn’t imply that the average person was eating a high-carb diet, quite the opposite (A Common Diet). Food in general was expensive and scarce and, among grains, wheat was the least common. At times, this would have forced feudal peasants and later landless peasants onto a diet limited in both carbohydrates and calories, which would have meant a typically ketogenic state (Fasting, Calorie Restriction, and Ketosis), albeit far from an optimal way of achieving it. The further back in time one looks, the greater the prevalence of ketosis would have been (e.g., the Spartan and Mongol diets), maybe with the exception of the ancient Egyptians (Ancient Atherosclerosis?). In places like Ireland, Russia, etc., the lower classes remained on this poverty diet, often a starvation diet, well into the mid-to-late 1800s, although in the case of the Irish it was an artificially constructed famine, as the potato crop was essentially being stolen by the English and sold on the international market.

Yet, in America, the poor were fortunate in being able to rely on a meat-based diet because wild game was widely available and easily obtained, even in cities. That may have been true for many European populations as well during earlier feudalism, specifically prior to the peasants being restricted in hunting and trapping on the commons. This is demonstrated by how health improved after the fall of the Roman Empire (Malnourished Americans). During this earlier period, only the wealthy could afford high-quality bread and large amounts of grain-based foods in general. That meant highly refined and fluffy white bread that couldn’t easily be adulterated. Likewise, for the early centuries of colonialism, sugar was only available to the wealthy — in fact, it was a controlled substance typically only found in pharmacies. But for the elite who had access, sugary pastries and other starchy dessert foods became popular. White bread and pastries were status symbols. Sugar was so scarce that wealthy households kept it locked away so the servants couldn’t steal it. Even fruit was disproportionately eaten by the wealthy. A fruit pie would truly have been a luxury with all three above ingredients combined in a single delicacy.

Part of the context is that, although grain yields had been increasing during the early colonial era, there weren’t dependable surplus yields of grains before the 1800s. Until then, white bread, pastries, and such simply were not affordable to most people. Consumption of grains, along with other starchy carbs and sugar, rose with 19th century advancements in agriculture. Simultaneously, income was increasing and the middle class was growing. But even as yields increased, most of the resulting surplus grain went to feeding livestock, not to feeding the poor. Grains were perceived as cattle feed. Protein consumption increased more than did carbohydrate consumption, at least initially. The American population, in particular, didn’t see the development of a high-carb diet until much later, a lag related to mass urbanization also happening later in the US.

By the end of the 19th century, a mass diet of starchy and sugary foods had emerged, especially with the spread of wheat farming and white bread. And, in the US, only in the 20th century did grain consumption finally surpass meat consumption. Since then, rates of autism have grown. Along with sensory sensitivity, autistics are well known for their pickiness about foods and their cravings for particular foods such as those made from highly refined wheat flour, from white bread to crackers. Yet the folktales in question were speaking to a still living memory of an earlier time before these changes had happened. Hans Christian Andersen first published “The Princess and the Pea” in 1835, but such stories had been orally told long before that, probably going back at least centuries, and we now know that some of these folktales have their origins millennia earlier, even into the Bronze Age. According to the Wikipedia article on “The Princess and the Pea”,

“The theme of this fairy tale is a repeat of that of the medieval Perso-Arabic legend of al-Nadirah.[6] […] Tales of extreme sensitivity are infrequent in world culture but a few have been recorded. As early as the 1st century, Seneca the Younger had mentioned a legend about a Sybaris native who slept on a bed of roses and suffered due to one petal folding over.[23] The 11th-century Kathasaritsagara by Somadeva tells of a young man who claims to be especially fastidious about beds. After sleeping in a bed on top of seven mattresses and newly made with clean sheets, the young man rises in great pain. A crooked red mark is discovered on his body and upon investigation a hair is found on the bottom-most mattress of the bed.[5] An Italian tale called “The Most Sensitive Woman” tells of a woman whose foot is bandaged after a jasmine petal falls upon it.”

I would take it as telling that this particular folktale doesn’t appear to be as ancient as other examples. That would support my argument that the sensory sensitivity of autism might be caused by greater consumption of refined wheat, something that only began to appear late in the Axial Age and only became common much later. Even the few wealthy who did have access in ancient times were eating rather limited amounts of white bread. It might have required hitting a certain level of intake, not seen until modernity or close to it, before extreme autistic symptoms became noticeable among a larger number of the aristocracy and monarchy.

* * *

Sources

Others have connected such folktales of sensitivity with autism:

The high cost and elite status of grains, especially white bread, prior to 19th century high yields:

The Life of a Whole Grain Junkie
by Seema Chandra

Did you know where the term refined comes from? Around 1826, whole grain bread used by the military was called superior for health versus the white refined bread used by the aristocracy. Before the industrial revolution, it was more labor consuming and more expensive to refine bread, so white bread was the main staple loaf for aristocracy. That’s why it was called “refined”.

The War on White Bread
by Livia Gershon

Bread has always been political. For Romans, it helped define class; white bread was for aristocrats, while the darkest brown loaves were for the poor. Later, Jacobin radicals claimed white bread for the masses, while bread riots have been a perennial theme of populist uprisings. But the political meaning of the staff of life changed dramatically in the early twentieth-century United States, as Aaron Bobrow-Strain, who went on to write the book White Bread, explained in a 2007 paper. […]

Even before this industrialization of baking, white flour had had its critics, like cracker inventor William Sylvester Graham. Now, dietary experts warned that white bread was, in the words of one doctor, “so clean a meal worm can’t live on it for want of nourishment.” Or, as doctor and radio host P.L. Clark told his audience, “the whiter your bread, the sooner you’re dead.”

Nutrition and Economic Development in the Eighteenth-Century Habsburg Monarchy: An Anthropometric History
by John Komlos
p.31

Furthermore, one should not disregard the cultural context of food consumption. Habits may develop that prevent the attainment of a level of nutritional status commensurate with actual real income. For instance, the consumption of white bread or of polished rice, instead of whole-wheat bread or unpolished rice, might increase with income, but might detract from the body’s well-being. Insofar as cultural habits change gradually over time, significant lags could develop between income and nutritional status.

pp. 192-194

As consequence, per capita food consumption could have increased between 1660 and 1740 by as much as 50 percent. The fact that real wages were higher in the 1730s than at any time since 1537 indicates a high standard of living was reached. The increase in grain exports, from 2.8 million quintals in the first decade of the eighteenth century to 6 million by the 1740s, is also indicative of the availability of nutrients.

The remarkably good harvests were brought about by the favorable weather conditions of the 1730s. In England the first four decades of the eighteenth century were much warmer than the last decades of the previous century (Table 5.1). Even small differences in temperature may have important consequences for production. […] As a consequence of high yields the price of consumables declined by 14 percent in the 1730s relative to the 1720s. Wheat cost 30 percent less in the 1730s than it did in the 1660s. […] The increase in wheat consumption was particularly important because wheat was less susceptible to mold than rye. […]

There is direct evidence that the nutritional status of many populations was, indeed, improving in the early part of the eighteenth century, because human stature was generally increasing in Europe as well as in America (see Chapter 2). This is a strong indication that protein and caloric intake rose. In the British colonies of North America, an increase in food consumption—most importantly, of animal protein—in the beginning of the eighteenth century has been directly documented. Institutional menus also indicate that diets improved in terms of caloric content.

Changes in British income distribution conform to the above pattern. Low food prices meant that the bottom 40 percent of the distribution was gaining between 1688 and 1759, but by 1800 had declined again to the level of 1688. This trend is another indication that a substantial portion of the population that was at a nutritional disadvantage was doing better during the first half of the eighteenth century than it did earlier, but that the gains were not maintained throughout the century.

The Roots of Rural Capitalism: Western Massachusetts, 1780-1860
By Christopher Clark
p. 77

Livestock also served another role, as a kind of “regulator,” balancing the economy’s need for sufficiency and the problems of producing too much. In good years, when grain and hay were plentiful, surpluses could be directed to fattening cattle and hogs for slaughter, or for exports to Boston and other markets on the hoof. Butter and cheese production would also rise, for sale as well as for family consumption. In poorer crop years, however, with feedstuffs rarer, cattle and swine could be slaughtered in greater numbers for household and local consumption, or for export as dried meat.

p. 82

Increased crop and livestock production were linked. As grain supplies began to overtake local population increases, more corn in particular became available for animal feed. Together with hay, this provided sufficient feedstuffs for farmers in the older Valley towns to undertake winter cattle fattening on a regular basis, without such concern as they had once had for fluctuations in output near the margins of subsistence. Winter fattening for market became an established practice on more farms.

When Food Changed History: The French Revolution
by Lisa Bramen

But food played an even larger role in the French Revolution just a few years later. According to Cuisine and Culture: A History of Food and People, by Linda Civitello, two of the most essential elements of French cuisine, bread and salt, were at the heart of the conflict; bread, in particular, was tied up with the national identity. “Bread was considered a public service necessary to keep the people from rioting,” Civitello writes. “Bakers, therefore, were public servants, so the police controlled all aspects of bread production.”

If bread seems a trifling reason to riot, consider that it was far more than something to sop up bouillabaisse for nearly everyone but the aristocracy—it was the main component of the working Frenchman’s diet. According to Sylvia Neely’s A Concise History of the French Revolution, the average 18th-century worker spent half his daily wage on bread. But when the grain crops failed two years in a row, in 1788 and 1789, the price of bread shot up to 88 percent of his wages. Many blamed the ruling class for the resulting famine and economic upheaval.
Read more: https://www.smithsonianmag.com/arts-culture/when-food-changed-history-the-french-revolution-93598442/

What Brought on the French Revolution?
by H.A. Scott Trask

Through 1788 and into 1789 the gods seemed to be conspiring to bring on a popular revolution. A spring drought was followed by a devastating hail storm in July. Crops were ruined. There followed one of the coldest winters in French history. Grain prices skyrocketed. Even in the best of times, an artisan or factor might spend 40 percent of his income on bread. By the end of the year, 80 percent was not unusual. “It was the connection of anger with hunger that made the Revolution possible,” observed Schama. It was also envy that drove the Revolution to its violent excesses and destructive reform.

Take the Reveillon riots of April 1789. Reveillon was a successful Parisian wall-paper manufacturer. He was not a noble but a self-made man who had begun as an apprentice paper worker but now owned a factory that employed 400 well-paid operatives. He exported his finished products to England (no mean feat). The key to his success was technical innovation, machinery, the concentration of labor, and the integration of industrial processes, but for all these the artisans of his district saw him as a threat to their jobs. When he spoke out in favor of the deregulation of bread distribution at an electoral meeting, an angry crowd marched on his factory, wrecked it, and ransacked his home.

Why did our ancestors prefer white bread to wholegrains?
by Rachel Laudan

Only in the late nineteenth and twentieth centuries did large numbers of “our ancestors”–and obviously this depends on which part of the world they lived in–begin eating white bread. […]

Wheat bread was for the few. Wheat did not yield well (only seven or eight grains for one planted compared to corn that yielded dozens) and is fairly tricky to grow.

White puffy wheat bread was for even fewer. Whiteness was achieved by sieving out the skin of the grain (bran) and the germ (the bit that feeds the new plant). In a world of scarcity, this made wheat bread pricey. And puffy, well, that takes fairly skilled baking plus either yeast from beer or the kind of climate that sourdough does well in. […]

Between 1850 and 1950, the price of wheat bread, even white wheat bread, plummeted as a result of the opening up of new farms in the US and Canada, Argentina, Australia and other places, the mechanization of plowing and harvesting, the introduction of huge new flour mills, and the development of continuous flow bakeries.

In 1800 only half the British population could afford wheat bread. In 1900 everybody could.

History of bread – Industrial age
The Industrial Age (1700 – 1887)
from The Federation of Bakers

In Georgian times the introduction of sieves made of Chinese silk helped to produce finer, whiter flour and white bread gradually became more widespread. […]

1757
A report accused bakers of adulterating bread by using alum, lime, chalk and powdered bones to keep it very white. Parliament banned alum and all other additives in bread but some bakers ignored the ban. […]

1815
The Corn Laws were passed to protect British wheat growers. The duty on imported wheat was raised and price controls on bread lifted. Bread prices rose sharply. […]

1826
Wholemeal bread, eaten by the military, was recommended as being healthier than the white bread eaten by the aristocracy.

1834
Rollermills were invented in Switzerland. Whereas stonegrinding crushed the grain, distributing the vitamins and nutrients evenly, the rollermill broke open the wheat berry and allowed easy separation of the wheat germ and bran. This process greatly eased the production of white flour but it was not until the 1870s that it became economic. Steel rollermills gradually replaced the old windmills and watermills.

1846
With large groups of the population near to starvation the Corn Laws were repealed and the duty on imported grain was removed. Importing good quality North American wheat enabled white bread to be made at a reasonable cost. Together with the introduction of the rollermill this led to the increase in the general consumption of white bread – for so long the privilege of the upper classes.

Of all foods bread is the most noble: Carl von Linné (Carl Linnaeus) on bread
by Leena Räsänen

In many contexts Linné explained how people with different standing in society eat different types of bread. He wrote, “Wheat bread, the most excellent of all, is used only by high-class people”, whereas “barley bread is used by our peasants” and “oat bread is common among the poor”. He made a remark that “the upper classes use milk instead of water in the dough, as they wish to have a whiter and better bread, which thereby acquires a more pleasant taste”. He compared his own knowledge on the food habits of Swedish society with those mentioned in classical literature. Thus, according to Linné, Juvenal wrote that “a soft and snow-white bread of the finest wheat is given to the master”, while Galen condemned oat bread as suitable only for cattle, not for humans. Here Linné had to admit that it is, however, consumed in certain provinces in Sweden.

Linné was aware of and discussed the consequences of consuming less tasty and less satisfying bread, but he seems to have accepted as a fact that people belonging to different social classes should use different foods to satisfy their hunger. For example, he commented that “bran is more difficult to digest than flour, except for hard-labouring peasants and the likes, who are scarcely troubled by it”. The necessity of having to eat filling but less palatable bread was inevitable, but could be even positive from the nutritional point of view. “In Östergötland they mix the grain with flour made from peas and in Scania with vetch, so that the bread may be more nutritious for the hard-working peasants, but at the same time it becomes less flavoursome, drier and less pleasing to the palate.” And, “Soft bread is used mainly by the aristocracy and the rich, but it weakens the gums and teeth, which get too little exercise in chewing. However, the peasant folk who eat hard bread cakes generally have stronger teeth and firmer gums”.

It is intriguing that Linné did not find it necessary to discuss the consumption or effect on health of other bakery products, such as the sweet cakes, tarts, pies and biscuits served by the fashion-conscious upper class and the most prosperous bourgeois. Several cookery books with recipes for the fashionable pastry products were published in Sweden in the eighteenth century [14]. The most famous of these, Hjelpreda i Hushållningen för Unga Fruentimmer by Kajsa Warg, published in 1755, included many recipes for sweet pastries [15]. Linné mentioned only in passing that the addition of egg makes the bread moist and crumbly, and sugar and currants impart a good flavour.

The sweet and decorated pastries were usually consumed with wine or with the new exotic beverages, tea and coffee. It is probable that Linné regarded pastries as unnecessary luxuries, since expensive imported ingredients, sugar and spices, were indispensable in their preparation. […]

Linné emphasized that soft and fresh bread does not draw in as much saliva and thus remains undigested for a long time, “like a stone in the stomach”. He strongly warned against eating warm bread with butter. While it was “considered as a delicacy, there was scarcely another food that was more damaging for the stomach and teeth, for they were loosen’d by it and fell out”. By way of illustration he related an example reported by a doctor who lived in a town near Amsterdam. Most of the inhabitants of this town were bakers, who sold bread daily to the residents of Amsterdam and had the practice of attracting customers with oven-warm bread, sliced and spread with butter. According to Linné, this particular doctor was not surprised when most of the residents of this town “suffered from bad stomach, poor digestion, flatulence, hysterical afflictions and 600 other problems”. […]

Linné was not the first in Sweden to write about famine bread. Among his remaining papers in London there are copies from two official documents from 1696 concerning the crop failure in the northern parts of Sweden and the possibility of preparing flour from different roots, and an anonymous small paper which contained descriptions of 21 plants, the roots or leaves of which could be used for flour [10]. These texts had obviously been studied by Linné with interest.

When writing about substitute breads, Linné formulated his aim as the following: “It will teach the poor peasant to bake bread with little or no grain in the circumstance of crop failure without destroying the body and health with unnatural foods, as often happens in the countryside in years of hardship” [10].

Linné’s idea for a publication on bread substitutes probably originated during his early journeys to Lapland and Dalarna, where grain substitutes were a necessity even in good years. Actually, bark bread was eaten in northern Sweden until the late nineteenth century [4]. In the poorest regions of eastern and north-eastern Finland it was still consumed in the 1920s [26]. […]

Bark bread has been used in the subarctic area since prehistoric times [4]. According to Linné, no other bread was such a common famine bread. He described how in springtime the soft inner layer can be removed from debarked pine trees, cleaned of any remaining bark, roasted or soaked to remove the resin, and dried and ground into flour. Linné had obviously eaten bark bread, since he could say that “it tastes rather well, is however more bitter than other bread”. His view of bark bread was most positive but perhaps unrealistic: “People not only sustain themselves on this, but also often become corpulent of it, indeed long for it.” Linné’s high regard for bark bread was shared by many of his contemporaries, but not all. For example, Pehr Adrian Gadd, the first professor of chemistry in Turku (Åbo) Academy and one of the most prominent utilitarians in Finland, condemned bark bread as “useless, if not harmful to use” [28]. In Sweden, Anders Johan Retzius, a professor in Lund and an expert on the economic and pharmacological potential of Swedish flora, called bark bread “a paltry food, with which they can hardly survive and of which they always after some time get a swollen body, pale and bluish skin, big and hard stomach, constipation and finally dropsy, which ends the misery” [4]. […]

Linné’s investigations of substitutes for grain became of practical service when a failed harvest of the previous summer was followed by famine in 1757 [10]. Linné sent a memorandum to King Adolf Fredrik in the spring of 1757 and pointed out the risk to the health of the hungry people when they ignorantly chose unsuitable plants as a substitute for grain. He included a short paper on the indigenous plants which in the shortage of grain could be used in bread-making and other cooking. His Majesty immediately permitted this leaflet to be printed at public expense and distributed throughout the country [10]. Soon Linné’s recipes using wild flora were read out in churches across Sweden. In Berättelse om The inhemska wäxter, som i brist af Säd kunna anwändas til Bröd- och Matredning, Linné [32] described the habitats and the popular names of about 30 edible wild plants, eight of which were recommended for bread-making.

* * *

I’ll just drop a couple videos here for general info:

On Democracy and Corporatocracy

“Whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.”
~U.S. Declaration of Independence

“Democracy was once a word of the people, a critical word, a revolutionary word. It has been stolen by those who would rule over the people, to add legitimacy to their rule…The basic idea of democracy is simple . . .
“Democracy is a word that joins demos—the people—with kratia—power . . . It describes an ideal, not a method for achieving it. It is not a kind of government, but an end of government; not a historically existing institution, but a historical project . . . if people take it up as such and struggle for it.”
~Douglas Lummis, Radical Democracy

“It is that right to local self-government – a right that we’re told that we already have, but which people discover is not there when they need it most – that serves as the guide-star of this slowly gathering movement.
“To stop them, corporate and governmental officials will be forced to slay their own sacred cow – the ‘rule of law’ – which they have used since time immemorial as their own version of ‘God said so.’ Thus, governmental and corporate officials will be forced to bring the power of the system’s own courts, legislatures, and regulators crashing down on them, in the face of clear and overwhelming evidence that our food and water systems, our energy systems, and our global climate are themselves crashing as a result of policies created by those very same institutions…
“These communities’ new rule of law – made in the name of environmental and economic sanity – believes that people and nature have rights, not corporations; that new civil, political, and environmental rights must be recognized; and that we must stop (immediately) those corporate acts which harm us.”
~Thomas Linzey, Local Lawmaking: A Call for a Community Rights Movement

“The main mark of modern governments is that we do not know who governs, de facto any more than de jure. We see the politician and not his backer; still less the backer of the backer; or what is most important of all, the banker of the backer. Enthroned above all, in a manner without parallel in all past, is the veiled prophet of finance, swaying all men living by a sort of magic, and delivering oracles in a language not understood of the people.”
~J.R.R. Tolkien, quoted in Contour magazine

“The large extent of bank influence is not easily seen. We seldom see an identified bank or a money corporation candidate running for office; but when questions arise which affect them, the banks have agents at work, whose operations are the more effective because they are unseen.”
~William M. Gouge, Advisor to President Andrew Jackson, Editor for the Philadelphia Gazette, Publisher of the “History of the American Banking System” and a “Fiscal History of Texas”

“Civil government, so far as it is instituted for the security of property, is in reality instituted for the defense of the rich against the poor, or of those who have some property against those who have none at all.”
~Adam Smith, 1776, Wealth of Nations, book V, ch.I, part II

“[T]he basic problem of legal thinkers after the Civil War was how to articulate a conception of property that could accommodate the tremendous expansion in the variety of forms of ownership spawned by a dynamic industrial society…The efforts by legal thinkers to legitimate the business corporation during the 1890’s were buttressed by a stunning reversal in American economic thought – a movement to defend and justify as inevitable the emergence of large-scale corporate concentration.”
~Morton Horwitz, The Transformation of American Law

“What did he [Bingham] think about the conversion of the Fourteenth Amendment from a protection of all constitutional rights for all citizens to a bulwark of corporate power against the protests of farmers and workers? Here we have a bit more information. Bingham later wrote that the amendment had been designed to protect natural persons, not corporations.
“That seems quite reasonable, particularly since the first sentence of Section one refers to persons ‘born or naturalized in the United States.’”
~Michael Kent Curtis, John A. Bingham and the Story of American Liberty: The Lost Cause Meets the ‘Lost Cause’, The Akron Law Review
(John Bingham was a Republican Congressman from Ohio and principal framer of the 14th Amendment to the U.S. Constitution which granted due process and equal protection under the law to freed slaves.)

“I think we would agree to describe the reality that flows from this corporate power as anti-democratic, anti-community, anti-worker, anti-person and anti-planet…Given our relative consensus on this situation, what should we be asking and doing about the corporation?…To effectively begin the work of countering what amounts to global corporate tyranny, we’ll need to do two kinds of defining: what we wish to see in the future, and what we are seeing in the present…We’ll never move these corporate behemoths out of our way with the poking sticks and thin willow reeds available to us through regulatory action…Nor will we gain their everlasting mercy with pleas for social responsibility or requests to sign a corporate ‘code of conduct,’ or the pitiful pleading for side agreements on free-trade pacts…Our colonized minds make it difficult to cut through our experience and envision real democracy. We’ve got a ‘cop in our head,’ and the cop comes from corporate headquarters…What must be done?
“When those of us who believe in an empowered citizenship see corporations spewing excrement and oppression with ever greater reach, we need to ask, ‘By what authority can corporations do that? They have no authority to do that. We never gave them authority.’ And we must work strategically to challenge their claims to authority…”
~Virginia Rasmussen, “Rethinking the Corporation”, Program on Corporations, Law & Democracy (POCLAD) principal, talk given during Women’s International League for Peace and Freedom conference, July 24-31, Baltimore, MD

Source – REAL Democracy History Calendar: July 22 – 28, July 15 – 21, & July 8 – 14