To Be Fat And Have Bread

The obsession with body fat is an interesting story. It didn’t begin a few generations ago but goes back centuries. Maybe that shouldn’t be surprising.

Those centuries were the colonial era, when the diet was transformed by the imperial trade in foreign foods. Notably, this included previously rare or never-before-seen varieties of fattening carbohydrates: sugar, potatoes, corn, rice, etc. The old feudal system was ending, and entirely different forms of food production and diet were developing, especially for the now landless peasants. Hunting, gathering, and grazing by commoners had been declining for a while at that point, as the last of the commons had been privatized. The loss of access to wild game took longer in the colonies, but eventually it happened everywhere.

The last stage of that shift overlapped with the beginnings of industrialization and agricultural improvements. In the 19th century, wheat surpluses grew, and hence costs and prices fell. Agriculture boomed even as fewer people were employed in it. There was also a sudden obsession with gender roles and social roles in general, such as the post-revolutionary expectation that the mother would make citizens out of her children. Bread-making, once an uncommon activity for Americans, became increasingly important to the normative identity of family life and the symbolic maintenance of the social order.

Regular consumption of wheat bread was once limited to the wealthy, and that is how refined bread gained its moral association with the refined class. Prior to the 19th century, only the wealthy could afford wheat; the poor were forced to rely upon cheaper grains and grain substitutes at a time when bread was regularly adulterated with bark, sawdust, chalk, etc. Poverty breads, in the previous centuries, were often made with no grain at all.* For wheat, and especially heavily refined white bread, to become available to all walks of life meant an upsurge of the civilizing process. The obsession with middle-class life took hold, and so cookbooks were produced in large numbers.

In a growing reactionary impulse, there was a nostalgic tendency toward invented traditions. Bread took on new meanings that were then projected onto the past. It went unacknowledged how radical the industrial agriculture and industrial milling were that made all of this possible. The disconnection is demonstrated by the simultaneous promotion of this industrial age’s grain production and the complaint that industrialized life was destroying all that was good. Bread, as a symbol, transcended such mere details.

With the aristocracy having been challenged during the Revolutionary Era, the once admired refinement of the refined class became suspect. The ideology of whole foods began to emerge and had some strong proponents. But by the end of the 1800s, the ideal of refinement regained prominence and prepared the way for the following century of ever greater industrialization of processed foods. Refinement represented progress. Only after more extensive refinement led to mass malnourishment, near the end of that century and heading into the next, did whole foods once again capture the public imagination.

Then we enter the true era of fat obsession, fat blaming, and dieting, endless dieting. Eat your whole grains, get your fiber, make sure you get enough servings of fruits and veggies, and don’t forget to exercise. Calories in, calories out. Count your calories, count your carbs, count your steps. Count every last one of them. Still, the basic sides of the debate remain the same: fewer carbohydrates vs. less meat, whole foods vs. refined foods, barbaric lifestyle vs. civilizing process, individual moral failure vs. societal changes, etc. One theme that runs through dietary advice from the ancient world to the present is a close link between physical health, mental health, and moral health — the latter erupting as moral panic and moral hygiene. But what stands out about the modern era, beginning in the 1600s, is the observation that psychological problems were mostly seen among the well-to-do.

This was often blamed on luxury and sometimes on meat (a complaint often about animals raised unnaturally in confinement and probably fed grain, an early equivalent of concerns about factory farming; but also a complaint about the introduction of foreign spices and the use of fancy sauces to make meat more appetizing). Still, an awareness was emerging that a high-carb diet might be playing a role, as it was often noted that the morbidly obese ate lots of pastries, fruit pies, and such. The poor didn’t have much access to wheat and sugar before the 1800s, but the wealthy had plenty of such foods centuries earlier. Meat consumption didn’t change much during that era of colonial trade. What changed the most was the availability of starchy and sugary foods, and the wealthy consumed them in great proportions. Meat had always been a desirable food, going back to earliest hominid evolution. Modern agriculture and global trade, however, entirely transformed the human diet with the introduction of massive amounts of carbohydrates.

It’s strange that, right from the beginning of the modern era, there were those pushing for a vegetarian diet; not many, but their voices were being heard for the first time. Or maybe it wasn’t so strange. Prior to the modern era, a vegetarian diet so far north in Europe would have been impossible. It was the elite who promoted vegetarianism, as only they could afford a vegetarian diet year round, buying expensive plant-based foods that were often shipped in from far away. During the Middle Ages and earlier, by contrast, vegetarianism was for the most part not an option for anyone, since the foods required for such a diet simply weren’t available enough to sustain life, certainly not in places like England or Germany.

There is another side to this that brings us back to the obsession with fat. It was only with the gradual increase of grain production that cattle could be fed grain, not only as supplemental feed in the winter but year round. This is also what made it possible to confine animals rather than graze them on fields. Grain surpluses weren’t consistent until the 19th century, but even before that grain production had been increasing, thanks to slow improvements in agriculture over the centuries. The rich could afford meat from grain-fed animals much earlier than the rest of the population, and it was highly sought after. That is because such meat is extremely fatty, creating those beautiful marbled steaks, pork chops, etc. (such fattiness, by the way, is a sign of metabolic syndrome in both animals and humans). Fat couldn’t have been a focus of debate before grain-fattened animals became common.

So, there is a reason that both wheat bread and fatty meat gained immense symbolic potency at the same time. Similarly, it was during this same era that vegetables became more common, and gardens likewise became symbols of wealth, abundance, and the good life. Only the rich could afford to maintain large gardens because of the difficulty and immense, time-consuming work involved (see The Jane Austen Diet by Bryan Kozlowski**; on the American diet before the 20th century, see also The Big Fat Surprise by Nina Teicholz, which I quote in Malnourished Americans). Gardens represented the changed diet of modern civilization. They were either indicators of progress or of decline, depending on one’s perspective. Prior to modernity, the diet had consisted to a much greater degree of foods that were gathered, hunted, trapped, and fished.

The shift from one source of food to another changed the diet and so changed the debate about diet. Suddenly there were more food options to argue about. Diet as a concept was being more fully formulated. Rather than being something inherited according to the traditional constraints of local food systems and customs, a diet, assuming one had the wealth, could be picked from a variety of possibilities. Even to this day, the obsession with dieting carries a taint of class privilege. It is, as they say, a first-world problem. But what is fascinating is how this way of thinking took hold in the 1600s and 1700s. There was a modern revolution in dietary thought in the generations before the modern political revolutions. The old order was falling apart and sometimes actively being dismantled. This created much anxiety and forced the individual into a state of uncertainty. Old wisdom could no longer be relied upon.

* * *

*Rather than bread, the food most associated with the laboring class was fish, a food the wealthy avoided. Think about how lobster and clams used to be poverty foods. In the Galenic theory of humoral physiology, fish is considered cold and wet, hard to digest and weakening. This same humoral category also included fruits and vegetables. This might be why, even to this day, many vegetarians and vegans will make an exception for fish, seeing it as different from ‘meat’. This is an old ideological bias, because ‘meat’ was believed to have the complete opposite effect of being hot and dry, easy to digest and invigorating. This is why meat, but not fish, was often banned during religious fasts and festivals.

As an interesting side note, the supposed cooling effect of fish was a reason for not eating it during the cold times of the year. Fish is one of the highest dietary sources of vitamin A. Another source is the precursor beta-carotene found in vegetables. It is interesting that these two types of food are considered of the same variety according to Galenic thought. Cold weather is one of the factors that can disrupt the body’s ability to convert beta-carotene into usable vitamin A. The idea of humors mixes this up slightly, but it may point to an intuition that something important was being observed. Eating more meat, rather than vegetables, in winter is a wise practice in a traditional society that can’t supplement such nutrients. Vitamin A is key for maintaining a strong immune system and handling stress (True Vitamin A For Health And Happiness).

By the way, it was during the 19th century that a discussion finally arose about vegetarianism. The question was whether life and health could be sustained on vegetables. Then again, those involved were probably still being influenced by Galenic thought. By vegetarianism, they likely meant a more general plant-based diet that excluded ‘meat’ but not necessarily fish. The context of the debate was the religious abstinence of Lent, during which fish was allowed. So maybe the fundamental argument was more about the possibility of long-term survival solely on moist, cooling foods. Whatever the exact point of contention, it was the first time in the modern Western world that a plant-based diet (be it vegan, vegetarian, or a pescetarian-style Mediterranean diet) was considered seriously.

These ideas have been inherited by us, even though the philosophical justifications no longer make sense to us. This is seen in the debate that continues over red meat in particular and meat in general, specifically in terms of the originally Galenic assertion that its heat and dryness build up the ‘blood’ (High vs Low Protein). It’s funny that dietary debates remain obsessed with red meat (along with the related issue of cows and their farts), even though actual consumption of red meat has declined over the past century. As with bread, the symbolic value of red meat has maybe even gained greater importance. Similarly, as I mentioned above, the categorization of fish remains hazy. I know a vegan who doesn’t eat ‘meat’ but does eat fish. When I noted how odd that was, a vegetarian I was talking to thought it made perfect sense. This is Galenic thought without the Galenic theory that at least made it a rational position; the ideological bias remains even though those adhering to it are unable to explain why they hold it. It amuses me.

Ideologies are powerful systems. They are mind viruses that can survive and mutate across centuries and sometimes millennia. Most of the time, their origins are lost to history. But sometimes we are able to trace them and it makes for strange material to study.

See: “Fish in Renaissance Dietary Theory” by Ken Albala, from Fish: Food from the Waters ed. by Harlan Walker, and Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden. Also, see the texts below, such as the discussion of vegetarianism.

* * *

(Both texts below are from collections that are freely available on Google Books and possibly elsewhere.)

The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cooking 2002
ed. by Harlan Walker
“The Apparition of Fat in Western Nutritional Theory”
by Ken Albala

Naturally dietary systems of the past had different goals in mind when framing their recommendations. They had different conceptions of the good, and at some point in history that came to include not being fat. Body size then became an official concern for dietary writers. Whether the original impetus for this change was a matter of fashion, spirituality or has its roots in a different approach to science is impossible to say with any degree of precision. But this paper will argue that nutritional science itself as reformulated in the 17th century was largely to blame for the introduction of fat into the discourse about how health should be defined. […] Obesity is a pathological state according to modern nutritional science. But it was not always so.

When and why fat became a medical issue has been a topic of concern among contemporary scholars. Some studies, such as Peter N. Stearns’ Fat History: Bodies and Beauty in the Modern West, place the origin of our modern obsession in the late 19th century when the rise of nutritional science and health movements led by figures like John Harvey Kellogg, hand in hand with modern advertising and Gibson Girls, swept away the Victorian preference for fulsome figures. As a form of social protest, those who could afford to, much as in the 60s, idealized the slim androgynous figure we associate with flappers. Others push the origin further back into the early 19th century, in the age of Muscular Christianity and Sylvester Graham. But clearly the obsession is earlier than this. In the 18th century the 448-pound physician George Cheyne and his miracle dieting had people flocking to try out the latest ‘cures.’ It was at the same time that dissertations on the topic of obesity became popular, and clearly the medical profession had classified this as a treatable condition. And readers had already been trained to monitor and police their own bodies for signs of impending corpulence. The roots of this fear and guilt must lie somewhere in the previous century as nutritional science was still groping its way through a myriad of chemical and mechanical theories attempting to quantify health and nutrition with empirical research.

The 17th century is also the ideal place to look if only because the earlier system of humoral physiology is almost totally devoid of a concept of fat as a sickness. […]

For all authors in the Galenic tradition it appears that fat was seen as a natural consequence of a complexion tending to the cold and moist, something which could be corrected, but not considered an illness that demanded serious attention. And socially there does not seem to have been any specific stigma attached to fat if Rubens’ taste in flesh is any measure.

The issue of fat really only emerges among authors who have abandoned, in part or totally, the system of humoral physiology. This seems to have something to do with both the new attempts to quantify nutrition, first and most famously by Santorio Santorio9 and also among those who began to see digestion and nutrition as chemical reactions which when gone awry cast fatty deposits throughout the body. It was only then that fat came to be considered a kind of sickness to be treated with therapy.10

The earliest indications that fat was beginning to be seen as a medical problem are found in the work of the first dietary writer who systematically weighed himself. Although Santorio does not seem to have been anxious about being overweight himself, he did consistently define health as the maintenance of body weight. Expanding on the rather vague concept of insensible perspiration used by Galenic authors, Santorio sought to precisely measure the amount of food he consumed each day compared to the amount excreted in ‘sensible’ evacuations. […] Still, fat was not a matter of eating too much. ‘He who eats more than he can digest, is nourished less than he ought to be, and [becomes] consequently emaciated.’12 More importantly, fat was a sign of a system in disarray. […]

Food was not in fact the only factor Santorio or his followers took into account though. As before, the amount of exercise one gets, baths, air quality, even emotions could alter the metabolic rate. But now, the effect of all these could be precisely calculated. […]

At the same time that these mechanistic conceptions of nutrition became mainstream, a chemical understanding of how food is broken down by means of acids and alkalis also came to be accepted by the medical profession. These ideas ultimately harked back to Paracelsus writing in the 16th century but were elaborated upon by 17th century writers […] It is clear that by the early 18th century fat could be seen as a physiological defect that could be corrected by heating the body to facilitate digestive fermentation and the passage of insensible perspiration. […] Although the theories themselves are obviously nothing like our own, we are much closer to the idea of fat as a medical condition. […]

Where Cheyne departs from conventional medical opinion, is in his recommendation of a cooked vegetable diet to counter the effects of a disordered system, which he admits is rooted in his own ‘experience and observation on my own crazy carcase and the infirmities of others I have treated’ rather than on any theoretical foundation.

The controversy over whether vegetables could be considered a proper diet, not only for the sick or overgrown but for healthy individuals, was of great concern in the 18th century. Nicholas Andry in his Traité des alimens de caresme offered an extended diatribe against the very notion that vegetables could sustain life, a question of particular importance in Catholic France where Lenten restrictions were still in force, at least officially. […] According to current medical theory, vegetables could not be suitable for weight loss, despite the successful results of the empirics. […]

It is clear that authors had a number of potentially conflicting theoretical models to draw from and both mechanical and chemical explanations could be used to explain why fat accumulates in the body. Yet with entirely different conceptual tools, these authors arrived at dietary goals surprisingly like our own, and equally as contentious. The ultimate goals now became avoiding disease and fat, and living a long life. While it would be difficult to prove that these dietary authors had any major impact beyond the wealthy elites and professionals who read their works, it is clear that a concern over fat was firmly in place by the mid 18th century, and appears to have its roots in a new conception of physiology which not only paid close attention to body weight as an index of health, but increasingly saw fat as a medical condition.

Food and Morality: Proceedings of the Oxford Symposium on Food and Cookery 2007
ed. by Susan R. Friedland
“Moral Fiber: Bread in Nineteenth-Century America”

by Mark McWilliams

From Sarah Josepha Hale, who claimed, ‘the more perfect the bread, the more perfect the lady’ to Sylvester Graham, who insisted, ‘the wife, the mother only’ has the ‘moral sensibility’ required to bake good bread for her family, bread often became a gendered moral marker in nineteenth-century American culture.1 Of course, what Hale and Graham considered ‘good’ bread differed dramatically, and exactly what constituted ‘good’ bread was much contested. Amidst technological change that made white flour more widely available and home cooking more predictable, bread, described in increasingly explicit moral terms, became the leading symbol of a housewife’s care for her family.

Americans were hardly the first to ascribe moral meaning to their daily bread. As Bernard Dupaigne writes, ‘since time immemorial [bread] has attended the great events of various human communities: monsoon or grape harvest bread, the blessed bread of Catholics or the unleavened bread of Passover, or the fasting-break bread of Ramadan. There is no bread that does not, somewhere in the world, celebrate an agricultural or religious holiday, enrich a family event, or commemorate the dead.’2 With such varied symbolic resonance, bread seems easily filled with new meanings.

In America (as later in France),3 bread became a revolutionary symbol. To the early English colonists’ dismay, European wheat did not adapt well to the North American climate; the shift to corn as the primary grain was perhaps the most important dietary adaptation made by the colonists. Wheat remained too expensive for common consumption well into the nineteenth century. […]

By the end of the Revolution, then, bread was already charged with moral meaning in the young United States. In the nineteenth century, this meaning shifted in response to agricultural improvements that made wheat more widely available, technological change that made bread easier to make consistently, and, perhaps most important, social change that made good bread the primary symbol of a housewife’s care for her family. In effect, bread suffered a kind of identity crisis that paralleled the national identity crisis of Jacksonian America. As Americans thought seriously about who they were in this new nation, about how they should act and even how they should eat, bread’s symbolic meaning – and bread itself – changed.

American agricultural production exploded, although the proportion of the population working on farms declined. James Trager notes that even before the McCormick reaper first sold in large numbers as farmers struggled to replace workers leaving for the 1849 Gold Rush, the average time required to produce a bushel of wheat declined 22 per cent from 1831 to 1840.7 Dramatic improvements in efficiency led to larger yields; for example, wheat production more than doubled between 1840 and 1860. Such increases in wheat production, combined with better milling procedures, made white flour finally available in quantities sufficient for white bread to become more than a luxury good.8

Even as wheat became easier to find for many Americans, bread remained notoriously difficult to make, or at least to make well. Lydia Maria Child, a baker’s daughter who became one of America’s leading writers, emphasizes what must have been the intensely frustrating difficulty of learning to cook in the era before predictable heat sources, standardized measurements, and consistent ingredients.9 […]

Unlike Hale, who implies that learning to bake better can be a kind of self-improvement, this passage works more as a dire warning to those not yet making the proper daily bread. Though bread becomes the main distinction between the civilized and the savage, Beecher turns quickly, and reassuringly, to the science of her day: ‘By lightness is meant simply that in order to facilitate digestion the particles are to be separated from each other by little holes or air-cells; and all the different methods of making light bread are neither more nor less than the formation of bread with these air cells’ (170). She then carefully describes how to produce the desired lightness in bread, instructions which must have been welcome to the young housewife now fully convinced of her bread’s moral importance.

The path for Beecher, Hale, and others had been prepared by Sylvester Graham, although he is little mentioned in their work.14 In his campaign to improve bread, Graham’s rhetoric ‘romanticized the life of the traditional household’ in ways that ‘unknowingly helped prepare women to find a new role as guardians of domestic virtue,’ as Stephen Nissenbaum notes.15 Bread was only one aspect of Graham’s program to educate Americans on what he called ‘the Science of Human Life.’ Believing on the one hand, unlike many at the time, that overstimulation caused debility and, on the other, that industrialization and commercialization were debasing modern life, Graham proposed a lifestyle based around strict controls on diet and sexuality.16 While Graham promoted a range of activities from vegetarianism to temperance, his emphasis on good bread was most influential. […]

And yet modern conditions make such bread difficult to produce. Each stage of the process is corrupted, according to Graham. Rather than grow wheat in ‘a pure virgin soil’ required for the best grain, farmers employ fields ‘exhausted by tillage, and debauched by the means which man uses to enrich and stimulate it.’ As Nissenbaum notes, the ‘conscious sexual connotations’ of Graham’s language here is typical of his larger system, but the language also begins to point to the moral dimensions of good bread (6).

Similarly loaded language marks Graham’s condemnation of bakery bread. Graham echoed the common complaints about adulteration by commercial bakers. But he added a unique twist: even the best bakery bread was doubly flawed. The flour itself was inferior because it was over-processed, according to Graham: the ‘superfine flour’ required for white bread ‘is always far less wholesome, in any and every situation of life, than that which is made of wheaten meal which contains all the natural properties of the grain.’ […]

As Nissenbaum argues, pointing to this passage, Graham’s claims invoke ‘the vision of a domestic idyll, of a mother nursing her family with bread and affection’ (8). Such a vision clearly anticipates the emphasis on cookery as a measure of a woman’s social worth in the domestic rhetoric that came so to characterize the mid-nineteenth century.

Such language increasingly linking cookery with morality emphasized the virtue not of the food itself but rather of the cooks preparing it. This linkage reached readers not only through the explosion of cookbooks and domestic manuals but also through the growing numbers of sentimental novels. Indeed, this linkage provided a tremendously useful trope for authors seeking a shorthand to define their fictional characters. And that trope, in turn, helped expand the popularity of interpreting cookery in moral terms. […]

After the Civil War, domestic rhetoric evolved away from its roots in the wholesome foods of the nation’s past toward the ever-more refined cuisine of the Gilded Age. Graham’s refusal to evolve in this direction – his system was based entirely in a nostalgic struggle against modernity, against refinement – may well be a large part of why his work was quickly left behind even by those for whom it had paved the way.

* * *

Here is another text I came across. It’s not free, but it seems like a good survey worth buying.


Autism and the Upper Crust

There are multiple folktales about the tender senses of royalty, aristocrats, and other elites. The best-known example is “The Princess and the Pea”. In the Aarne-Thompson-Uther system of folktale categorization, it is listed as type 704, about the search for a sensitive wife. That isn’t to say that all the narrative variants of elite sensitivity involve potential wives. Anyway, the man who made this particular story famous is Hans Christian Andersen, who published his version in 1835. He longed to be a part of the respectable class but felt excluded. Some speculate that he projected his own class issues onto his slightly altered version of the folktale, something discussed in the Wikipedia article about the story:

“Wullschlager observes that in “The Princess and the Pea” Andersen blended his childhood memories of a primitive world of violence, death and inexorable fate, with his social climber’s private romance about the serene, secure and cultivated Danish bourgeoisie, which did not quite accept him as one of their own. Researcher Jack Zipes said that Andersen, during his lifetime, “was obliged to act as a dominated subject within the dominant social circles despite his fame and recognition as a writer”; Andersen therefore developed a feared and loved view of the aristocracy. Others have said that Andersen constantly felt as though he did not belong, and longed to be a part of the upper class.[11] The nervousness and humiliations Andersen suffered in the presence of the bourgeoisie were mythologized by the storyteller in the tale of “The Princess and the Pea”, with Andersen himself the morbidly sensitive princess who can feel a pea through 20 mattresses.[12] Maria Tatar notes that, unlike the folk heroine of his source material for the story, Andersen’s princess has no need to resort to deceit to establish her identity; her sensitivity is enough to validate her nobility. For Andersen, she indicates, “true” nobility derived not from an individual’s birth but from their sensitivity. Andersen’s insistence upon sensitivity as the exclusive privilege of nobility challenges modern notions about character and social worth. The princess’s sensitivity, however, may be a metaphor for her depth of feeling and compassion.[1] […] Researcher Jack Zipes notes that the tale is told tongue-in-cheek, with Andersen poking fun at the “curious and ridiculous” measures taken by the nobility to establish the value of bloodlines. He also notes that the author makes a case for sensitivity being the decisive factor in determining royal authenticity and that Andersen “never tired of glorifying the sensitive nature of an elite class of people”.[15]”

Even if that is true, there is more going on here than some guy working out his personal issues through fiction. This princess’s sensory sensitivity sounds like autism spectrum disorder, and I have a theory about that. Autism has been associated with certain foods like wheat, specifically refined flour in highly processed foods (The Agricultural Mind). And a high-carb diet in general causes numerous neurocognitive problems (Ketogenic Diet and Neurocognitive Health), along with other health conditions such as metabolic syndrome (Dietary Dogma: Tested and Failed) and insulin resistance (Coping Mechanisms of Health), atherosclerosis (Ancient Atherosclerosis?) and scurvy (Sailors’ Rations, a High-Carb Diet) — by the way, the rates of these diseases have been increasing over the generations and often first appeared among the affluent. Sure, grains have long been part of the diet, but the one grain most associated with the wealthy going back millennia was wheat, as it was harder to grow, which kept it in short supply and so expensive. Indeed, it is wheat, not the other grains, that gets brought up in relation to autism. This is largely because of gluten, though other things have been pointed to.

It is relevant that the historical period in which these stories were written down was around when the first large grain surpluses were becoming common, and so bread, white bread most of all, became a greater part of the diet. But this dietary change was first seen among the upper classes. It’s too bad we don’t have cross-generational data on autism rates broken down by demographics and diet, but it is interesting to note that neurasthenia, a 19th-century mental health condition also involving sensitivity, was seen as a disease of the middle-to-upper class (The Crisis of Identity), and this notion of the elite as sensitive was a romanticized ideal going back to the 1700s with what Jane Austen referred to as ‘sensibility’ (see Bryan Kozlowski’s The Jane Austen Diet, as quoted in the link immediately above). In that same historical period, others noted that schizophrenia was spreading along with civilization (e.g., Samuel Gridley Howe and Henry Maudsley; see The Invisible Plague by Edwin Fuller Torrey & Judy Miller), and I’d add that there appear to be some overlapping factors between schizophrenia and autism — besides gluten, some of the implicated factors are glutamate, exorphins, inflammation, etc. “It is unlikely,” writes William Davis, “that wheat exposure was the initial cause of autism or ADHD but, as with schizophrenia, wheat appears to be associated with worsening characteristics of the conditions” (Wheat Belly, p. 48).

For most of human history, crop failures and famine were a regular occurrence. And this most harshly affected the poor masses when grain and bread prices went up, leading to food riots and sometimes revolutions (e.g., French Revolution). Before the 1800s, grains were so expensive that, in order to make them affordable, breads were often adulterated with fillers or entirely replaced with grain substitutes, the latter referred to as “famine breads” and sometimes made with tree bark. Even when available, the average person might be spending most of their money on bread, as it was one of the most costly foods around and other foods weren’t always easily obtained.

Even so, grain being highly sought after certainly doesn't imply that the average person was eating a high-carb diet; quite the opposite (A Common Diet). Food in general was expensive and scarce and, among grains, wheat was the least common. At times, this would have forced feudal peasants and later landless peasants onto a diet limited in both carbohydrates and calories, which would have meant a typically ketogenic state (Fasting, Calorie Restriction, and Ketosis), albeit far from an optimal way of achieving it. The further back in time one looks, the more prevalent ketosis would have been (e.g., the Spartan and Mongol diets), maybe with the exception of the ancient Egyptians (Ancient Atherosclerosis?). In places like Ireland and Russia, the lower classes remained on this poverty diet, often a starvation diet, well into the mid-to-late 1800s, although in the case of the Irish it was an artificially constructed famine, as the potato crop was essentially being stolen by the English and sold on the international market.

Yet, in America, the poor were fortunate in being able to rely on a meat-based diet, because wild game was widely available and easily obtained, even in cities. That may have been true for many European populations as well during earlier feudalism, specifically before peasants were restricted from hunting and trapping on the commons. This is demonstrated by how health improved after the fall of the Roman Empire (Malnourished Americans). During this earlier period, only the wealthy could afford high-quality bread, and large amounts of grain-based foods in general. That meant highly refined and fluffy white bread that couldn't easily be adulterated. Likewise, for the early centuries of colonialism, sugar was only available to the wealthy; in fact, it was a controlled substance typically found only in pharmacies. But for the elite who had access, sugary pastries and other starchy dessert foods became popular. White bread and pastries were status symbols. Sugar was so scarce that wealthy households kept it locked away so the servants couldn't steal it. Even fruit was disproportionately eaten by the wealthy. A fruit pie, combining all three of these luxury ingredients in a single delicacy, would truly have been a luxury.

Part of the context is that, although grain yields had been increasing during the early colonial era, there were no dependable surplus yields of grain before the 1800s. Until then, white bread, pastries, and the like simply were not affordable to most people. Consumption of grains, along with other starchy carbs and sugar, rose with 19th-century advancements in agriculture. Simultaneously, incomes were rising and the middle class was growing. But even as yields increased, most of the new surplus grain went to feeding livestock, not to feeding the poor. Grains were perceived as cattle feed. Protein consumption increased more than carbohydrate consumption did, at least initially. The American population, in particular, didn't develop a high-carb diet until much later, a delay related to mass urbanization also happening later in the US.

Coming to the end of the 19th century, there emerged a mass diet of starchy and sugary foods, above all through the spread of wheat farming and white bread. And, in the US, only in the 20th century did grain consumption finally surpass meat consumption. Since then, rates of autism have been growing. Along with sensory sensitivity, autistics are well known for their pickiness about food and their cravings for particular foods, such as those made from highly refined wheat flour, from white bread to crackers. Yet the folktales in question were speaking to a still-living memory of an earlier time, before these changes had happened. Hans Christian Andersen first published "The Princess and the Pea" in 1835, but such stories had been told orally long before that, probably going back at least centuries, although we now know that some of these folktales have their origins millennia earlier, even into the Bronze Age. According to the Wikipedia article on "The Princess and the Pea",

“The theme of this fairy tale is a repeat of that of the medieval Perso-Arabic legend of al-Nadirah.[6] […] Tales of extreme sensitivity are infrequent in world culture but a few have been recorded. As early as the 1st century, Seneca the Younger had mentioned a legend about a Sybaris native who slept on a bed of roses and suffered due to one petal folding over.[23] The 11th-century Kathasaritsagara by Somadeva tells of a young man who claims to be especially fastidious about beds. After sleeping in a bed on top of seven mattresses and newly made with clean sheets, the young man rises in great pain. A crooked red mark is discovered on his body and upon investigation a hair is found on the bottom-most mattress of the bed.[5] An Italian tale called “The Most Sensitive Woman” tells of a woman whose foot is bandaged after a jasmine petal falls upon it.”

I would take it as telling that this particular folktale doesn't appear to be as ancient as other examples. That would support my argument that the sensory sensitivity of autism might be caused by greater consumption of refined wheat, something that only began to appear late in the Axial Age and only became common much later. Even the few wealthy who did have access in ancient times were eating rather limited amounts of white bread. It might have required hitting a certain level of intake, not seen until modernity or close to it, before extreme autistic symptoms became noticeable among a larger number of the aristocracy and monarchy.

* * *

Sources

Others have connected such folktales of sensitivity with autism:

The high cost and elite status of grains, especially white bread, prior to 19th century high yields:

The Life of a Whole Grain Junkie
by Seema Chandra

Did you know where the term refined comes from? Around 1826, whole grain bread used by the military was called superior for health versus the white refined bread used by the aristocracy. Before the industrial revolution, it was more labor consuming and more expensive to refine bread, so white bread was the main staple loaf for aristocracy. That’s why it was called “refined”.

The War on White Bread
by Livia Gershon

Bread has always been political. For Romans, it helped define class; white bread was for aristocrats, while the darkest brown loaves were for the poor. Later, Jacobin radicals claimed white bread for the masses, while bread riots have been a perennial theme of populist uprisings. But the political meaning of the staff of life changed dramatically in the early twentieth-century United States, as Aaron Bobrow-Strain, who went on to write the book White Bread, explained in a 2007 paper. […]

Even before this industrialization of baking, white flour had had its critics, like cracker inventor William Sylvester Graham. Now, dietary experts warned that white bread was, in the words of one doctor, “so clean a meal worm can’t live on it for want of nourishment.” Or, as doctor and radio host P.L. Clark told his audience, “the whiter your bread, the sooner you’re dead.”

Nutrition and Economic Development in the Eighteenth-Century Habsburg Monarchy: An Anthropometric History
by John Komlos
p.31

Furthermore, one should not disregard the cultural context of food consumption. Habits may develop that prevent the attainment of a level of nutritional status commensurate with actual real income. For instance, the consumption of white bread or of polished rice, instead of whole-wheat bread or unpolished rice, might increase with income, but might detract from the body’s well-being. Insofar as cultural habits change gradually over time, significant lags could develop between income and nutritional status.

pp. 192-194

As consequence, per capita food consumption could have increased between 1660 and 1740 by as much as 50 percent. The fact that real wages were higher in the 1730s than at any time since 1537 indicates a high standard of living was reached. The increase in grain exports, from 2.8 million quintals in the first decade of the eighteenth century to 6 million by the 1740s, is also indicative of the availability of nutrients.

The remarkably good harvests were brought about by the favorable weather conditions of the 1730s. In England the first four decades of the eighteenth century were much warmer than the last decades of the previous century (Table 5.1). Even small differences in temperature may have important consequences for production. […] As a consequence of high yields the price of consumables declined by 14 percent in the 1730s relative to the 1720s. Wheat cost 30 percent less in the 1730s than it did in the 1660s. […] The increase in wheat consumption was particularly important because wheat was less susceptible to mold than rye. […]

There is direct evidence that the nutritional status of many populations was, indeed, improving in the early part of the eighteenth century, because human stature was generally increasing in Europe as well as in America (see Chapter 2). This is a strong indication that protein and caloric intake rose. In the British colonies of North America, an increase in food consumption—most importantly, of animal protein—in the beginning of the eighteenth century has been directly documented. Institutional menus also indicate that diets improved in terms of caloric content.

Changes in British income distribution conform to the above pattern. Low food prices meant that the bottom 40 percent of the distribution was gaining between 1688 and 1759, but by 1800 had declined again to the level of 1688. This trend is another indication that a substantial portion of the population that was at a nutritional disadvantage was doing better during the first half of the eighteenth century than it did earlier, but that the gains were not maintained throughout the century.

The Roots of Rural Capitalism: Western Massachusetts, 1780-1860
By Christopher Clark
p. 77

Livestock also served another role, as a kind of “regulator,” balancing the economy’s need for sufficiency and the problems of producing too much. In good years, when grain and hay were plentiful, surpluses could be directed to fattening cattle and hogs for slaughter, or for exports to Boston and other markets on the hoof. Butter and cheese production would also rise, for sale as well as for family consumption. In poorer crop years, however, with feedstuffs rarer, cattle and swine could be slaughtered in greater numbers for household and local consumption, or for export as dried meat.

p. 82

Increased crop and livestock production were linked. As grain supplies began to overtake local population increases, more corn in particular became available for animal feed. Together with hay, this provided sufficient feedstuffs for farmers in the older Valley towns to undertake winter cattle fattening on a regular basis, without such concern as they had once had for fluctuations in output near the margins of subsistence. Winter fattening for market became an established practice on more farms.

When Food Changed History: The French Revolution
by Lisa Bramen

But food played an even larger role in the French Revolution just a few years later. According to Cuisine and Culture: A History of Food and People, by Linda Civitello, two of the most essential elements of French cuisine, bread and salt, were at the heart of the conflict; bread, in particular, was tied up with the national identity. “Bread was considered a public service necessary to keep the people from rioting,” Civitello writes. “Bakers, therefore, were public servants, so the police controlled all aspects of bread production.”

If bread seems a trifling reason to riot, consider that it was far more than something to sop up bouillabaisse for nearly everyone but the aristocracy—it was the main component of the working Frenchman’s diet. According to Sylvia Neely’s A Concise History of the French Revolution, the average 18th-century worker spent half his daily wage on bread. But when the grain crops failed two years in a row, in 1788 and 1789, the price of bread shot up to 88 percent of his wages. Many blamed the ruling class for the resulting famine and economic upheaval.

What Brought on the French Revolution?
by H.A. Scott Trask

Through 1788 and into 1789 the gods seemed to be conspiring to bring on a popular revolution. A spring drought was followed by a devastating hail storm in July. Crops were ruined. There followed one of the coldest winters in French history. Grain prices skyrocketed. Even in the best of times, an artisan or factor might spend 40 percent of his income on bread. By the end of the year, 80 percent was not unusual. “It was the connection of anger with hunger that made the Revolution possible,” observed Schama. It was also envy that drove the Revolution to its violent excesses and destructive reform.

Take the Reveillon riots of April 1789. Reveillon was a successful Parisian wall-paper manufacturer. He was not a noble but a self-made man who had begun as an apprentice paper worker but now owned a factory that employed 400 well-paid operatives. He exported his finished products to England (no mean feat). The key to his success was technical innovation, machinery, the concentration of labor, and the integration of industrial processes, but for all these the artisans of his district saw him as a threat to their jobs. When he spoke out in favor of the deregulation of bread distribution at an electoral meeting, an angry crowded marched on his factory, wrecked it, and ransacked his home.

Why did our ancestors prefer white bread to wholegrains?
by Rachel Laudan

Only in the late nineteenth and twentieth century did large numbers of “our ancestors”–and obviously this depends on which part of the world they lived in–begin eating white bread. […]

Wheat bread was for the few. Wheat did not yield well (only seven or eight grains for one planted compared to corn that yielded dozens) and is fairly tricky to grow.

White puffy wheat bread was for even fewer. Whiteness was achieved by sieving out the skin of the grain (bran) and the germ (the bit that feeds the new plant). In a world of scarcity, this made wheat bread pricey. And puffy, well, that takes fairly skilled baking plus either yeast from beer or the kind of climate that sourdough does well in. […]

Between 1850 and 1950, the price of wheat bread, even white wheat bread, plummeted in price as a result of the opening up of new farms in the US and Canada, Argentina, Australia and other places, the mechanization of plowing and harvesting, the introduction of huge new flour mills, and the development of continuous flow bakeries.

In 1800 only half the British population could afford wheat bread. In 1900 everybody could.

History of bread – Industrial age
The Industrial Age (1700 – 1887)
from The Federation of Bakers

In Georgian times the introduction of sieves made of Chinese silk helped to produce finer, whiter flour and white bread gradually became more widespread. […]

1757
A report accused bakers of adulterating bread by using alum lime, chalk and powdered bones to keep it very white. Parliament banned alum and all other additives in bread but some bakers ignored the ban. […]

1815
The Corn Laws were passed to protect British wheat growers. The duty on imported wheat was raised and price controls on bread lifted. Bread prices rose sharply. […]

1826
Wholemeal bread, eaten by the military, was recommended as being healthier than the white bread eaten by the aristocracy.

1834
Rollermills were invented in Switzerland. Whereas stonegrinding crushed the grain, distributing the vitamins and nutrients evenly, the rollermill broke open the wheat berry and allowed easy separation of the wheat germ and bran. This process greatly eased the production of white flour but it was not until the 1870s that it became economic. Steel rollermills gradually replaced the old windmills and watermills.

1846
With large groups of the population near to starvation the Corn Laws were repealed and the duty on imported grain was removed. Importing good quality North American wheat enabled white bread to be made at a reasonable cost. Together with the introduction of the rollermill this led to the increase in the general consumption of white bread – for so long the privilege of the upper classes.

Of all foods bread is the most noble: Carl von Linné (Carl Linneaus) on bread
by Leena Räsänen

In many contexts Linné explained how people with different standing in society eat different types of bread. He wrote, “Wheat bread, the most excellent of all, is used only by high-class people”, whereas “barley bread is used by our peasants” and “oat bread is common among the poor”. He made a remark that “the upper classes use milk instead of water in the dough, as they wish to have a whiter and better bread, which thereby acquires a more pleasant taste”. He compared his own knowledge on the food habits of Swedish society with those mentioned in classical literature. Thus, according to Linné, Juvenal wrote that “a soft and snow-white bread of the finest wheat is given to the master”, while Galen condemned oat bread as suitable only for cattle, not for humans. Here Linné had to admit that it is, however, consumed in certain provinces in Sweden.

Linné was aware of and discussed the consequences of consuming less tasty and less satisfying bread, but he seems to have accepted as a fact that people belonging to different social classes should use different foods to satisfy their hunger. For example, he commented that “bran is more difficult to digest than flour, except for hard-labouring peasants and the likes, who are scarcely troubled by it”. The necessity of having to eat filling but less palatable bread was inevitable, but could be even positive from the nutritional point of view. “In Östergötland they mix the grain with flour made from peas and in Scania with vetch, so that the bread may be more nutritious for the hard-working peasants, but at the same time it becomes less flavoursome, drier and less pleasing to the palate.” And, “Soft bread is used mainly by the aristocracy and the rich, but it weakens the gums and teeth, which get too little exercise in chewing. However, the peasant folk who eat hard bread cakes generally have stronger teeth and firmer gums”.

It is intriguing that Linné did not find it necessary to discuss the consumption or effect on health of other bakery products, such as the sweet cakes, tarts, pies and biscuits served by the fashion-conscious upper class and the most prosperous bourgeois. Several cookery books with recipes for the fashionable pastry products were published in Sweden in the eighteenth century 14. The most famous of these, Hjelpreda i Hushållningen för Unga Fruentimmer by Kajsa Warg, published in 1755, included many recipes for sweet pastries 15. Linné mentioned only in passing that the addition of egg makes the bread moist and crumbly, and sugar and currants impart a good flavour.

The sweet and decorated pastries were usually consumed with wine or with the new exotic beverages, tea and coffee. It is probable that Linné regarded pastries as unnecessary luxuries, since expensive imported ingredients, sugar and spices, were indispensable in their preparation. […]

Linné emphasized that soft and fresh bread does not draw in as much saliva and thus remains undigested for a long time, “like a stone in the stomach”. He strongly warned against eating warm bread with butter. While it was “considered as a delicacy, there was scarcely another food that was more damaging for the stomach and teeth, for they were loosen’d by it and fell out”. By way of illustration he told an example reported by a doctor who lived in a town near Amsterdam. Most of the inhabitants of this town were bakers, who sold bread daily to the residents of Amsterdam and had the practice of attracting customers with oven-warm bread, sliced and spread with butter. According to Linné, this particular doctor was not surprised when most of the residents of this town “suffered from bad stomach, poor digestion, flatulence, hysterical afflictions and 600 other problems”. […]

Linné was not the first in Sweden to write about famine bread. Among his remaining papers in London there are copies from two official documents from 1696 concerning the crop failure in the northern parts of Sweden and the possibility of preparing flour from different roots, and an anonymous small paper which contained descriptions of 21 plants, the roots or leaves of which could be used for flour 10. These texts had obviously been studied by Linné with interest.

When writing about substitute breads, Linné formulated his aim as the following: “It will teach the poor peasant to bake bread with little or no grain in the circumstance of crop failure without destroying the body and health with unnatural foods, as often happens in the countryside in years of hardship” 10.

Linné’s idea for a publication on bread substitutes probably originated during his early journeys to Lapland and Dalarna, where grain substitutes were a necessity even in good years. Actually, bark bread was eaten in northern Sweden until the late nineteenth century 4. In the poorest regions of eastern and north-eastern Finland it was still consumed in the 1920s 26. […]

Bark bread has been used in the subarctic area since prehistoric times 4. According to Linné, no other bread was such a common famine bread. He described how in springtime the soft inner layer can be removed from debarked pine trees, cleaned of any remaining bark, roasted or soaked to remove the resin, and dried and ground into flour. Linné had obviously eaten bark bread, since he could say that “it tastes rather well, is however more bitter than other bread”. His view of bark bread was most positive but perhaps unrealistic: “People not only sustain themselves on this, but also often become corpulent of it, indeed long for it.” Linné’s high regard for bark bread was shared by many of his contemporaries, but not all. For example, Pehr Adrian Gadd, the first professor of chemistry in Turku (Åbo) Academy and one of the most prominent utilitarians in Finland, condemned bark bread as “useless, if not harmful to use” 28. In Sweden, Anders Johan Retzius, a professor in Lund and an expert on the economic and pharmacological potential of Swedish flora, called bark bread “a paltry food, with which they can hardly survive and of which they always after some time get a swollen body, pale and bluish skin, big and hard stomach, constipation and finally dropsy, which ends the misery” 4. […]

Linné’s investigations of substitutes for grain became of practical service when a failed harvest of the previous summer was followed by famine in 1757 10. Linné sent a memorandum to King Adolf Fredrik in the spring of 1757 and pointed out the risk to the health of the hungry people when they ignorantly chose unsuitable plants as a substitute for grain. He included a short paper on the indigenous plants which in the shortage of grain could be used in bread-making and other cooking. His Majesty immediately permitted this leaflet to be printed at public expense and distributed throughout the country 10. Soon Linné’s recipes using wild flora were read out in churches across Sweden. In Berättelse om The inhemska wäxter, som i brist af Säd kunna anwändas til Bröd- och Matredning, Linné 32 described the habitats and the popular names of about 30 edible wild plants, eight of which were recommended for bread-making.

A Common Diet

“English peasants in Medieval times lived on a combination of meat stews, leafy vegetables and dairy products which scientists say was healthier than modern diets.”
~ Frédéric Leroy

There is an idea that, in the past, the poor were fed on bread while the rich monopolized meat. Whether or not this was true of some societies, it certainly wasn’t true of many. For example, in ancient Egypt, all levels of society seemed to have had the same basic high-carb diet with lots of bread. It consisted of the types and amounts of foods that are recommended in the USDA Food Pyramid. And their health suffered for it. As with people eating the same basic diet today, they had high rates of the diseases of civilization, specifically metabolic syndrome: obesity, diabetes, and heart disease. Also, they had serious tooth decay, something not seen with low-carb hunter-gatherers.

The main difference for ancient Egyptians was maybe the quality of bread. The same was true in Medieval Europe. Refined flour was limited to the wealthy. White breads didn't become commonly available to most Westerners until the 1800s, about the same time that surplus grain harvests allowed for a high-carb diet and for the practice of fattening up cows with grains. Unsurprisingly, grain-fed humans also started to become fat during this time, with the earliest commentary on obesity coming from numerous writers of the era: Jane Austen, Jean Anthelme Brillat-Savarin, William Banting, etc.

In the Middle Ages, there were some other class differences in eating patterns. One basic difference is that feudal serfs ate more salmon and the aristocracy more chicken. It is not what a modern person would expect, considering salmon is far healthier, but the logic is that chickens were a rare commodity and the poor wouldn't want to regularly eat the birds that produced the eggs they depended upon. Besides the bread issue, the Medieval aristocracy were also eating more sugary desserts. Back then, only the rich had access to or could afford sugar. Even fruit would have been rare for peasants.

Feudalism, especially early feudalism, was actually rather healthy for peasants. Not that anyone's diet was exactly low-carb, at least not intentionally, though that came closer to the truth in the early Middle Ages, when populations returned to a more rural lifestyle of hunting, trapping, and gathering, and any peasant had access to what was called the 'commons'. That did change over time as laws on land use became more restrictive. Still, in the centuries following the collapse of the Roman Empire, health and longevity drastically improved for most of the population.

The living conditions for the poor only got worse again as society moved toward modernity with the increase of large-scale agriculture and more processed foods. But even into the late Middle Ages, the diet remained relatively healthy since feudal laws protected the rights of commoners in raising their own food and grazing animals. Subsistence farming combined with some wild foods was not a bad way to feed a population, as long as there was enough land to go around.

A similar diet was maintained among most Americans until the 20th century, when urbanization became the norm. As late as the Great Depression, much of the population was able to return to a rural lifestyle or otherwise had access to rural areas, which was feasible with the then much smaller population. Joe Bageant describes his childhood in a West Virginia farming community in the 1940s and 1950s as still mostly subsistence farming with a barter economy. We've only seen the worst health outcomes among the poor since mass urbanization, which for African Americans only happened around the 1960s or 1970s, when the majority finally became urbanized, long after it happened in Europe. The healthier diet of non-industrialized rural areas was a great equalizer for most of human existence.

The main thing I found interesting was that diets didn't always differ much between populations in the same society. The commonalities of diet in any given era were greater than the differences. We now think of bread and refined flour as cheap food, but at an earlier time such food would have been far more expensive and generally less available across all of society. As agriculture expanded, natural sources of food such as wild game became scarce and everyone became increasingly dependent on grains, along with legumes and tubers. This was a dramatic change with detrimental outcomes, and it contributed to other, larger changes going on in society.

The divergence of diets by class seems to be primarily a modern shift, including the access the upper classes now have to a diversity of fruits and vegetables, even out of season and grown in distant places. The perception of grains as poor people's food and cattle feed only became a typical view starting in the 1800s, something discussed by Bryan Kozlowski in The Jane Austen Diet. As with the Roman Empire, the poorest of the poor lost access to healthy foods during the enclosure movement and on into industrialization. It was only then that the modern high-carb diet became prevalent. It was also the first time that inequality had risen to such an extreme level, which forced a wedge into the once commonly held diet.

The early Middle Age communities (more akin to ancient city-states) maintained a more similar lifestyle between rich and poor, as they literally lived close together, worshiped together, celebrated Carnival together, even ate together. A lord or knight would have maintained a retinue of advisers, assistants, and servants, plus a large number of dependents and workers who ate collective meals in the main house or castle. Later on, knights were no longer needed to defend communities, and the aristocracy became courtiers spending most of their time at the distant royal court. Then the enclosure movement created the landless peasants who would become the working poor. As class divides grew, diets diverged accordingly. We are so entrenched in a high-inequality society that we have forgotten how severely abnormal this is compared to most societies throughout history. The result of greater inequality of wealth and power has been a worsening inequality of nutrition and health.

* * *

Reconciling organic residue analysis, faunal, archaeobotanical and historical records: Diet and the medieval peasant at West Cotton, Raunds, Northamptonshire
by J. Dunne, A. Chapman, P. Blinkhorn, R. P. Evershed

  • Medieval peasant diet comprises meat and cabbage stews cooked on open hearths.
  • Dairy products, butter and cheese, known as ‘white meats of the poor’ also eaten.

The medieval peasant diet that was ‘much healthier’ than today’s average eating habits: Staples of meat, leafy vegetables and cheese are found in residue inside 500-year-old pottery
by Joe Pinkstone

They found the surprisingly well-rounded diet of the peasants would have kept them well-fed and adequately nourished.

Dr Julie Dunne at the University of Bristol told MailOnline: ‘The medieval peasant had a healthy diet and wasn’t lacking in anything major!

‘It is certainly much healthier than the diet of processed foods many of us eat today.

‘The meat stews (beef and mutton) with leafy vegetables (cabbage, leek) would have provided protein and fibre and important vitamins and the dairy products (butter and ‘green’ cheeses) would also have provided protein and other important nutrients.

‘These dairy products were sometimes referred to as the “white meats” of the poor, and known to have been one of the mainstays of the medieval peasants diet. […]

Historical documents state that medieval peasants ate meat, fish, dairy products, fruit and vegetables.

But the researchers say that before their study there was little direct evidence to support this.

Carcinogenic Grains

In understanding human health, we have to look at all factors as a package deal. Our gut-brain is a system, as is our entire mind-body. Our relationships, lifestyle, the environment around us: all of it is inseparable. This is true even if we limit ourselves to diet alone. It's not simply calories in/calories out, macronutrient ratios, or anything else along those lines. It is the specific foods eaten, in combination with which other foods, and in the context of stress, toxins, epigenetic inheritance, gut health, and so much else, that determines what effects manifest in the individual.

There are numerous examples of this. But I’ll stick to a simple one, which involves several factors and the relationship between them. First, red meat is associated with cancer and heart disease. Yet causation is hard to prove, as red meat consumption is associated with many other foods in the standard American diet, such as added sugars and vegetable oils in processed foods. The association might be based on confounding factors that are culture-specific, which can explain why we find societies with heavy meat consumption and little cancer.

So, what else might be involved? We have to consider what red meat is being eaten with, at least in the standard American diet that is used as a control in most research. There are, of course, the added sugars and vegetable oils; they are seriously bad for health and may explain much of the confusion. Saturated fat intake has been dropping since the early 1900s and, in its place, there has been a steady rise in the use of vegetable oils; we now know that highly heated and hydrogenated vegetable oils do severe damage. Also, some of the original research that blamed saturated fat, when re-analyzed, found that sugar had the stronger correlation with heart disease.

Saturated fat, like cholesterol, was wrongly accused. This misunderstanding has, over multiple generations at this point, led to the early death of at least hundreds of millions of people worldwide, as dozens of the wealthiest and most powerful countries enforced it in their official dietary recommendations, which transformed the world's food system. Like eggs, red meat became the fall guy.

Such things as heart disease are related to obesity, and conventional wisdom tells us that fat makes us fat. Is that true? Not exactly, or at least not directly. I was amused to discover that a scientific report commissioned by the British government in 1846 (Experimental Researches on the Food of Animals, and the Fattening of Cattle: With Remarks on the Food of Man. Based Upon Experiments Undertaken by Order of the British Government by Robert Dundas Thomson) concluded that "The present experiments seem to demonstrate that the fat of animals cannot be produced from the oil of the food." Fat doesn't make people fat, and it has been observed for centuries that low-carb, meat-eating populations tend to be slim.

So, in most cases, what does cause fat accumulation? It is fat combined with plenty of carbs and sugar that is guaranteed to make us fat; that is, fat in the presence of glucose, since the two compete as fuel sources.

Think about what an American meal with red meat looks like. A plate might hold a steak with some rolls or slices of bread, along with a potato and maybe some starchy 'vegetables' like corn, peas, or lima beans. Or there will be a hamburger with a bun, a side of fries, and a large sugary drink ('diet' drinks are no better, as we now know artificial sweeteners fool the body and so are just as likely to make you fat and diabetic). What is the common factor? Red meat combined with wheat or some other grain, as part of a diet drenched in carbs and sugar (and all of it cooked or slathered in vegetable oils).

Most Americans have a far greater total intake of carbs, sugar, and vegetable oils than of red meat and saturated fat. The preferred meat of Americans these days is chicken, with fish also being popular. So why do red meat and saturated fat continue to be blamed for the worsening rates of heart disease and metabolic disease? It's simply not rational, based on the established facts in the field of diet and nutrition. That isn't to claim that too much red meat couldn't be problematic; it depends on the total diet. Also, Americans have the habit of grilling their red meat, and grilling increases carcinogens. That could be avoided by not charring one's meat, but it equally applies to not burning (or frying) anything one eats, including white meat and plant foods. In terms of this one factor, you'd be better off eating beef roasted with vegetables than going with a plant-based meal that included foods like french fries, fried okra, and grilled vegetable shish kabobs.

Considering all of that, what exactly is the cause of cancer that keeps showing up in epidemiological studies? Sarah Ballantyne has some good answers to that (see quoted passage below). It’s not so much about red meat itself as it is about what red meat is eaten with. The crux of the matter is that Americans eat more starchy carbs, mostly refined flour, than they do vegetables. What Ballantyne explains is that two of the potential causes of cancer associated with red meat only occur in a diet deficient in vegetables and abundant in grains. It is the total diet as seen in the American population that is the cause of high rates of cancer.

Just as a heavy meat diet without grains is not problematic, a heavy carb diet without grains is not necessarily problematic either. Some of the healthiest populations eat lots of carbs like sweet potatoes, but you won't find any healthy population that eats as many grains as Americans do. There are many issues with grains considered in isolation (read the work of David Perlmutter or any number of writers on the paleo diet), but it is grains combined with certain other foods in particular that can contribute to health concerns.

Then again, some of this is about proportion. For most of the history of agriculture, humans ate small amounts of grains as an occasional food. Grains tended to be stored for hard times or for trade, or else turned into alcohol to be mixed with water from unclean sources. The shift to large amounts of grains made into refined flour is an evolutionarily unique dilemma our bodies aren't designed to handle. The first accounts of white bread are found in texts from slightly over two millennia ago, and most Westerners couldn't afford white bread until the past few centuries, when industrialized milling began. Before that, people tended to eat foods as they became available and didn't mix them as much (e.g., eating fruits and vegetables in season). Hamburgers were invented only about a century ago. The constant combining of red meat and grains is not something we are adapted for. That this harms our health maybe shouldn't surprise us.

Red meat can be a net loss to health or a net gain. It depends not on the red meat, but what is and isn’t eaten with it. Other factors matter as well. Health can’t be limited to a list of dos and don’ts, even if such lists have their place in the context of more detailed knowledge and understanding. The simplest solution is to eat as most humans ate for hundreds of thousands of years, and more than anything else that means avoiding grains. Even without red meat, many people have difficulties with grains.

Let’s return to the context of evolution. Hominids have been eating fatty red meat for millions of years (early humans having prized red meat from blubbery megafauna until their mass extinction), and yet meat-eating hunter-gatherers rarely get cancer, heart disease, or any of the other modern ailments. How long ago was it when the first humans ate grains? About 12 thousand years ago. Most humans on the planet never touched a grain until the past few millennia. And fewer still included grains with almost every snack and meal until the past few generations. So, what is this insanity of government dietary recommendations putting grains as the base of the food pyramid? Those grains are feeding the cancerous microbes, and doing much else that is harmful.

In conclusion, is red meat bad for human health? It depends. Red meat that is charred or heavily processed combined with wheat and other carbs, lots of sugar and vegetable oils, and few nutritious vegetables, well, that would be a shitty diet that will inevitably lead to horrible health consequences. Then again, the exact same diet minus the red meat would still be a recipe for disease and early death. Yet under other conditions, red meat can be part of a healthy diet. Even a ton of pasture-raised red meat (with plenty of nutrient-dense organ meats) combined with an equal amount of organic vegetables (grown on healthy soil, bought locally, and eaten in season), in exclusion of grains especially refined flour and with limited intake of all the other crap, that would be one of the healthiest diets you could eat.

On the other hand, if you are addicted to grains as many are and can’t imagine a world without them, you would be wise to avoid red meat entirely. Assuming you have any concerns about cancer, you should choose one or the other but not both. I would note, though, that there are many other reasons to avoid grains while there are no other known reasons to avoid red meat, at least for serious health concerns, although some people exclude red meat for other reasons such as digestion issues. The point is that whether or not you eat red meat is a personal choice (based on taste, ethics, etc), not so much a health choice, as long as we separate out grains. That is all we can say for certain based on present scientific knowledge.

* * *

We’ve known about this for years now. Isn’t it interesting that no major health organization, scientific institution, corporate news outlet, or government agency has ever warned the public about the risk factors of carcinogenic grains? Instead, we get major propaganda campaigns to eat more grains because that is where the profit is for big ag, big food, and big oil (that makes farm chemicals and transports the products of big ag and big food). How convenient! It’s nice to know that corporate profit is more important than public health.

But keep listening to those who tell you that cows are destroying the world, even though there are fewer cows in North America than there once were buffalo. Yeah, monocultural GMO crops immersed in deadly chemicals that destroy soil and deplete nutrients are going to save us, not traditional grazing land that existed for hundreds of millions of years. So, sure, we could go on producing massive yields of grains in a utopian fantasy beloved by technocrats and plutocrats that further disconnects us from the natural world and our evolutionary origins, an industrial food system dependent on turning the whole world into endless monocrops denatured of all other life, making entire regions into ecological deserts that push us further into mass extinction. Or we could return to traditional ways of farming and living with a more traditional diet largely of animal foods (meat, fish, eggs, dairy, etc) balanced with an equal amount of vegetables, the original hunter-gatherer diet.

Our personal health is important. And it is intimately tied to the health of the earth. Civilization as we know it was built on grains. That wasn’t necessarily a problem when grains were a small part of the diet and populations were small. But is it still a sustainable socioeconomic system as part of a healthy ecological system? No, it isn’t. So why do we continue to do more of the same that caused our problems in the hope that it will solve our problems? As we think about how different parts of our diet work together to create conditions of disease or health, we need to begin thinking this way about our entire world.

* * *

Paleo Principles
by Sarah Ballantyne

While this often gets framed as an argument for going vegetarian or vegan, it’s actually a reflection of the importance of eating plenty of plant foods along with meat. When we take a closer look at these studies, we see something extraordinarily interesting: the link between meat and cancer tends to disappear once the studies adjust for vegetable intake. Even more exciting, when we examine the mechanistic links between meat and cancer, it turns out that many of the harmful (yes, legitimately harmful!) compounds of meat are counteracted by protective compounds in plant foods.

One major mechanism linking meat to cancer involves heme, the iron-containing compound that gives red meat its color (in contrast to the nonheme iron found in plant foods). Where heme becomes a problem is in the gut: the cells lining the digestive tract (enterocytes) metabolize it into cytotoxic compounds (meaning toxic to living cells), which can then damage the gut barrier (specifically the colonic mucosa; see page 67), cause cell proliferation, and increase fecal water toxicity—all of which raise cancer risk. Yikes! In fact, part of the reason red meat is linked with cancer far more often than with white meat could be due to their differences in heme content; white meat (poultry and fish) contains much, much less.

Here’s where vegetables come to the rescue! Chlorophyll, the pigment in plants that makes them green, has a molecular structure that’s very similar to heme. As a result, chlorophyll can block the metabolism of heme in the intestinal tract and prevent those toxic metabolites from forming. Instead of turning into harmful by-products, heme ends up being metabolized into inert compounds that are no longer toxic or damaging to the colon. Animal studies have demonstrated this effect in action: one study on rats showed that supplementing a heme-rich diet with chlorophyll (in the form of spinach) completely suppressed the pro-cancer effects of heme. All the more reason to eat a salad with your steak.

Another mechanism involves L-carnitine, an amino acid that’s particularly abundant in red meat (another candidate for why red meat seems to disproportionately increase cancer risk compared to other meats). When we consume L-carnitine, our intestinal bacteria metabolize it into a compound called trimethylamine (TMA). From there, the TMA enters the bloodstream and gets oxidized by the liver into yet another compound, trimethylamine-N-oxide (TMAO). This is the one we need to pay attention to!

TMAO has been strongly linked to cancer and heart disease, possibly due to promoting inflammation and altering cholesterol transport. Having high levels of it in the bloodstream could be a major risk factor for some chronic diseases. So is this the nail in the coffin for meat eaters?

Not so fast! An important study on this topic published in 2013 in Nature Medicine sheds light on what’s really going on. This paper had quite a few components, but one of the most interesting has to do with gut bacteria. Basically, it turns out that the bacteria group Prevotella is a key mediator between L-carnitine consumption and having high TMAO levels in our blood. In this study, the researchers found that participants with gut microbiomes dominated by Prevotella produced the most TMA (and therefore TMAO, after it reached the liver) from the L-carnitine they ate. Those with microbiomes high in Bacteroides rather than Prevotella saw dramatically less conversion to TMA and TMAO.

Guess what Prevotella loves to snack on? Grains! It just so happens that people with high Prevotella levels tend to be those who eat grain-based diets (especially whole grain), since this bacterial group specializes in fermenting the type of polysaccharides abundant in grain products. (For instance, we see extremely high levels of Prevotella in populations in rural Africa that rely on cereals like millet and sorghum.) At the same time, Prevotella doesn’t seem to be associated with a high intake of non-grain plant sources, such as fruit and vegetables.

So is it really the red meat that’s a problem . . . or is it the meat in the context of a grain-rich diet? Based on the evidence we have so far, it seems that grains (and the bacteria that love to eat them) are a mandatory part of the L-carnitine-to-TMAO pathway. Ditch the grains, embrace veggies, and our gut will become a more hospitable place for red meat!

* * *

Georgia Ede has a detailed article about the claim of meat causing cancer. In it, she provides several useful summaries of and quotes from the scientific literature.

WHO Says Meat Causes Cancer?

In November 2013, 23 cancer experts from eight countries gathered in Norway to examine the science related to colon cancer and red/processed meat. They concluded:

“…the interactions between meat, gut and health outcomes such as CRC [colorectal cancer] are very complex and are not clearly pointing in one direction….Epidemiological and mechanistic data on associations between red and processed meat intake and CRC are inconsistent and underlying mechanisms are unclear…Better biomarkers of meat intake and of cancer occurrence and updated food composition databases are required for future studies.” 1) To read the full report: http://www.ncbi.nlm.nih.gov/pubmed/24769880 [open access]

Translation: we don’t know if meat causes colorectal cancer. Now THAT is a responsible, honest, scientific conclusion.

How the WHO?

How could the WHO have come to such a different conclusion than this recent international gathering of cancer scientists? As you will see for yourself in my analysis below, the WHO made the following irresponsible decisions:

  1. The WHO cherry-picked studies that supported its anti-meat conclusions, ignoring those that showed either no connection between meat and cancer or even a protective effect of meat on colon cancer risk. These neutral and protective studies were specifically mentioned within the studies cited by the WHO (which makes one wonder whether the WHO committee members actually read the studies referenced in its own report).
  2. The WHO relied heavily on dozens of “epidemiological” studies (which by their very nature are incapable of demonstrating a cause and effect relationship between meat and cancer) to support its claim that meat causes cancer.
  3. The WHO cited a mere SIX experimental studies suggesting a possible link between meat and colorectal cancer, four of which were conducted by the same research group.
  4. THREE of the six experimental studies were conducted solely on RATS. Rats are not humans and may not be physiologically adapted to high-meat diets. All rats were injected with powerful carcinogenic chemicals prior to being fed meat. Yes, you read that correctly.
  5. Only THREE of the six experimental studies were human studies. All were conducted with a very small number of subjects and were seriously flawed in more than one important way. Examples of flaws include using unreliable or outdated biomarkers and/or failing to include proper controls.
  6. Some of the theories put forth by the WHO about how red/processed meat might cause cancer are controversial or have already been disproved. These theories were discredited within the texts of the very same studies cited to support the WHO’s anti-meat conclusions, again suggesting that the WHO committee members either didn’t read these studies or deliberately omitted information that didn’t support the WHO’s anti-meat position.

Does it matter whether the WHO gets it right or wrong about meat and cancer? YES.

“Strong media coverage and ambiguous research results could stimulate consumers to adapt a ‘safety first’ strategy that could result in abolishment of red meat from the diet completely. However, there are reasons to keep red meat in the diet. Red meat (beef in particular) is a nutrient dense food and typically has a better ratio of N6:N3-polyunsaturated fatty acids and significantly more vitamin A, B6 and B12, zinc and iron than white meat (compared values from the Dutch Food Composition Database 2013, raw meat). Iron deficiencies are still common in parts of the populations in both developing and industrialized countries, particularly pre-school children and women of childbearing age (WHO)… Red meat also contains high levels of carnitine, coenzyme Q10, and creatine, which are bioactive compounds that may have positive effects on health.” 2)

The bottom line is that there is no good evidence that unprocessed red meat increases our risk for cancer. Fresh red meat is a highly nutritious food which has formed the foundation of human diets for nearly two million years. Red meat is a concentrated source of easily digestible, highly bioavailable protein, essential vitamins and minerals. These nutrients are more difficult to obtain from plant sources.

It makes no sense to blame an ancient, natural, whole food for the skyrocketing rates of cancer in modern times. I’m not interested in defending the reputation of processed meat (or processed foods of any kind, for that matter), but even the science behind processed meat and cancer is unconvincing, as I think you’ll agree. […]

Regardless, even if you believe in the (non-existent) power of epidemiological studies to provide meaningful information about nutrition, more than half of the 29 epidemiological studies did NOT support the WHO’s stance on unprocessed red meat and colorectal cancer.

It is irresponsible and misleading to include this random collection of positive and negative epidemiological studies as evidence against meat.

The following quote is taken from one of the experimental studies cited by the WHO. The authors of the study begin their paper with this striking statement:

“In puzzling contrast with epidemiological studies, experimental studies do not support the hypothesis that red meat increases colorectal cancer risk. Among the 12 rodent studies reported in the literature, none demonstrated a specific promotional effect of red meat.” 3)

[Oddly enough, none of these twelve “red meat is fine” studies, which the authors went on to list and describe within the text of the introduction to this article, were included in the WHO report].

I cannot emphasize enough how common it is to see statements like this in scientific papers about red meat. Over and over again, researchers see that epidemiology suggests a theoretical connection between some food and some health problem, so they conduct experiments to test the theory and find no connection. This is why our nutrition headlines are constantly changing. One day eggs are bad for you, the next day they’re fine. Epidemiologists are forever sending well-intentioned scientists on time-consuming, expensive wild goose chases, trying to prove that meat is dangerous, when all other sources–from anthropology to physiology to biochemistry to common sense—tell us that meat is nutritious and safe.

* * *

Below is a good discussion between Dr. Steven Gundry and Dr. Paul Saladino. It’s an uncommon dialogue. Even though Gundry is known for warning against the harmful substances in plant foods, he has shifted toward a plant-based diet, also warning against too much animal food, or at least too much protein (a separate issue involving IGF-1 that isn’t relevant to this post). As for Saladino, he is a carnivore and so takes Gundry’s argument against plants to a whole other level. Saladino sees no problem with meat, of course. And his view contradicts what Gundry writes about in his most recent book, The Longevity Paradox.

Anyway, they got onto the topic of TMAO. Saladino points out that fish has more fully formed TMAO than red meat produces in combination with grain-loving Prevotella. Even vegetables produce TMAO. So, why is beef being scapegoated? It’s pure ignorant idiocy. To further this point, Saladino explained that he has tested the microbiome of patients of his on the carnivore diet and it comes up low on the Prevotella bacteria. He doesn’t think TMAO is the danger people claim it is. But even if it were, the single safest diet might be the carnivore diet.

Gundry didn’t even disagree. He pointed out that he did testing on patients of his who are long-term vegans and now in their 70s. They had extremely high levels of TMAO. He sent their lab results to the Cleveland Clinic for an opinion. The experts there refused to believe that it was possible and so dismissed the evidence. That is the power of dietary ideology when it forms a self-enclosed reality tunnel. Red meat is bad and vegetables are good. The story changes over time. It’s the saturated fat. No, it’s the TMAO. Then it will be something else. Always looking for a rationalization to uphold the preferred dogma.

* * *

7/25/19 – Additional thoughts: There is always new research coming out. And, as is typical, it is often contradictory. It is hard to know what exactly is being studied. The most basic understanding in mainstream nutrition right now seems to be that red meat is associated with TMAO by way of carnitine and Prevotella (Studies reveal role of red meat in gut bacteria, heart disease development). But many assumptions are being made. This research tends to be epidemiological/observational, and so most factors aren’t being controlled.

Worse still, they aren’t comparing the equivalent extremes: not veganism vs. carnivory but veganism and vegetarianism vs. omnivory. That leaves out the even greater complicating factor that, as the data show, a significant number of vegans and vegetarians occasionally eat animal foods. There really aren’t that many long-term vegans and vegetarians to study, because 80% of people who start the diet quit it, and of the remaining 20% few are consistent.

As for omnivores, they are a diverse group that could include hundreds of dietary variations. One variety of omnivory is the paleo diet, a slightly restricted omnivory in that grains are excluded, often along with legumes, white potatoes, dairy, added sugar, etc. The paleo diet was studied and showed higher levels of TMAO, and, rather than cancer, the focus was on cardiovascular disease (Heart disease biomarker linked to paleo diet).

So, that must mean the paleo diet is bad, right? When people think of the paleo diet, they think of a caveman lugging a big hunk of meat. But the reality is that the standard paleo diet, although including red meat, emphasizes fish and heaping platefuls of vegetables. Why is red meat getting blamed? In a bizarre twist, the lead researcher of the paleo study, Dr. Angela Genoni, thought the problem was the lack of grains. But it is precisely grains that the TMAO-producing Prevotella gut bacteria love so much. How could reducing grains increase TMAO? No explanation was offered. Before we praise grains, why not look at the sub-populations of vegans, vegetarians, fruitarians, etc. who also avoid grains?

There is a more rational and probable factor. It turns out that fish and vegetables raise TMAO levels higher than red meat does (Eat your vegetables (and fish): Another reason why they may promote heart health). This solves the mystery of why some of Dr. Gundry’s vegan patients had high TMAO levels. Yet, in another bizarre leap of logic, the same TMAO that is used to castigate red meat is suddenly portrayed as healthy, as reducing cardiovascular risk, when it comes from sources other than red meat. It is the presence of red meat that somehow magically transforms TMAO into an evil substance that will kill you. Or maybe, just maybe, it has nothing directly to do with TMAO alone.

After a long and detailed analysis of the evidence, Dr. Georgia Ede concluded that, “As far as I can tell, the authors’ theory that red meat provides carnitine for bacteria to transform into TMA which our liver then converts to TMAO, which causes our macrophages to fill up with cholesterol, block our arteries, and cause heart attacks is just that–a theory–full of sound and fury, signifying nothing” (Does Carnitine from Red Meat Cause Heart Disease?).

 

Malnourished Americans

Prefatory Note

It would be easy to mistake this writing for a carnivore’s rhetoric against the evils of grains and agriculture. I’m a lot more agnostic on the issue than it might seem. But I do come across as strongly opinionated, from decades of personal experience with bad eating habits and their consequences; and my dietary habits were no better when I was vegetarian.

I’m not so much pro-meat as I am for healthy fats and oils, not only from animal sources but also from plants, with coconut oil and olive oil being two of my favorites. As long as you are getting adequate protein, from whatever source (including vegetarian foods), there is no absolute rule about protein intake. But hunter-gatherers on average do eat more fats and oils than protein (and more than vegetables as well), whether the protein comes from meat or from seeds and nuts (though the protein and vegetables they get are of extremely high quality and, of course, nutrient-dense, along with much fiber). Too much protein with too little fat/oil causes rabbit starvation. It’s fat and oil that have higher satiety and, combined with low-carb ketosis, are amazing at eliminating food cravings, addictions, and over-eating.

Besides, I have nothing against plant-based foods. I eat more vegetables on the paleo diet than I did in the past, even when I was a vegetarian, and more than any vegetarian I know; not just more in quantity but also more in quality. Many paleo and keto dieters have embraced a plant-based diet with varying attitudes about meat and fat. Dr. Terry Wahls, a former vegetarian, reversed her symptoms of multiple sclerosis by formulating a paleo diet that includes massive loads of nutrient-dense vegetables, while adding in nutrient-dense animal foods as well (e.g., liver).

I’ve picked up three books lately that emphasize plants even further. One is The Essential Vegetarian Keto Cookbook, which is pretty much as the title describes it: mostly recipes, with some introductory material about ketosis. Another book, Ketotarian by Dr. Will Cole, is likewise about keto vegetarianism, but with leniency toward fish consumption and ghee (the former not strictly vegetarian and the latter not strictly paleo). The most recent I got is The Paleo Vegetarian Diet by Dena Harris, another person with a lenient attitude toward diet. That is what I prefer in my tendency toward ideological impurity. About diet, I’m bi-curious or maybe multi-curious.

My broader perspective is that of traditional foods. This is largely based on the work of Weston A. Price, which I was introduced to long ago by way of the writings of Sally Fallon Morrell (formerly Sally Fallon). It is not a paleo diet in that agricultural foods are allowed, but its advocates share a common attitude with paleolists in the valuing of traditional nutrition and food preparation. Authors from both camps bond over their respect for Price’s work and so often reference those on the other side in their writings. I’m of the opinion, in line with traditional foods, that if you are going to eat agricultural foods then traditional preparation is all the more important (from long-fermented bread and fully soaked legumes to cultured dairy and raw aged cheese). Many paleolists share this opinion and some are fine with such things as ghee. My paleo commitment didn’t stop me from enjoying a white roll for Thanksgiving, adorning it with organic goat butter, and it didn’t kill me.

I’m not so much arguing against all grains in this post as I’m pointing out the problems found at the extreme end of dietary imbalance that we’ve reached this past century: industrialized and processed, denatured and toxic, grain-based/obsessed and high-carb-and-sugar. In the end, I’m a flexitarian who has come to see the immense benefits in the paleo approach, but I’m not attached to it as a belief system. I heavily weigh the best evidence and arguments I can find in coming to my conclusions. That is what this post is about. I’m not trying to tell anyone how to eat. I hope that heads off certain areas of potential confusion and criticism. So, let’s get to the meat of the matter.

Grain of Truth

Let me begin with a quote, share some related info, and then circle back around to putting the quote into context. The quote is from Grain of Truth by Stephen Yafa. It’s a random book I picked up at a secondhand store and my attraction to it was that the author is defending agriculture and grain consumption. I figured it would be a good balance to my other recent readings. Skimming it, one factoid stuck out. In reference to new industrial milling methods that took hold in the late 19th century, he writes:

“Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” (p. 17)

That is remarkable. He is talking about the now infamous highly refined flour, something that never existed before. Even commercial whole wheat breads today, with some fiber added back in, have little in common with what was traditionally made for millennia. My grandparents were of that particular generation that was so severely malnourished, and so that was the world into which my parents were born. The modern health decline that has gained mainstream attention began many generations back. Okay, so put that on the backburner.

Against the Grain

In a post by Dr. Malcolm Kendrick, I was having a discussion in the comments section (and, at the same time, I was having a related discussion on my own blog). Göran Sjöberg brought up James C. Scott’s book about the development of agriculture, Against the Grain — writing that, “This book is very much about the health deterioration, not least through epidemics partly due to compromised immune resistance, that occurred in the transition from hunting and gathering to sedentary mono-crop agriculture state level scale, first in Mesopotamia about five thousand years ago.”

Scott’s view has interested me for a while. I find compelling the way he connects grain farming, legibility, record-keeping, and taxation. There is a reason great empires were built on grain fields, not on potato patches or vegetable gardens, much less cattle ranching. Grain farming is easily observed and measured, tracked and recorded, and that meant it could be widely taxed to fund large centralized governments along with their armies and, later on, their police forces and intelligence agencies. The earliest settled societies arose prior to agriculture, but they couldn’t become major civilizations until the cultivation of grains.

Another commenter, Sasha, responded with what she considered important qualifications: “I think there are too many confounders in transition from hunter gatherers to agriculture to suggest that health deterioration is due to one factor (grains). And since it was members of upper classes who were usually mummified, they had vastly different lifestyles from that of hunter gatherers. IMO, you’re comparing apples to oranges… Also, grain consumption existed in hunter gatherers and probably intensified long before Mesopotamia 5 thousands years ago as wheat was domesticated around 9,000 BCE and millet around 6,000 BCE to use just two examples.”

It is true that pre-Neolithic hunter-gatherers, in some cases, sporadically ate grains in small amounts, or at least we have evidence they were doing something with grains; for all we know, they might have been mixing them with medicinal herbs or using them as a thickener for paints. It’s anyone’s guess. Assuming they were eating those traces of grains we’ve discovered, it surely was nowhere near the level of the Neolithic agriculturalists. Furthermore, during the following millennia, grains were radically changed through cultivation. As for the Egyptian elite, they were eating more grains than anyone, since farmers were still forced to partly subsist on hunting, fishing, and gathering.

I’d take the argument much further forward into history. We know from records that, through the 19th century, Americans were eating more meat than bread. Vegetable and fruit consumption was also relatively low and mostly seasonal. Part of that is because gardening was difficult with so many pests. Besides, with so many natural areas around, hunting and gathering remained a large part of the American diet. Even in the cities, wild game was easily obtained at cheap prices. Into the 20th century, hunting and gathering was still important and sustained many families through the Great Depression and World War era when many commercial foods were scarce.

It was different in Europe, though. Mass urbanization happened centuries before it did in the United States. And not much European wilderness was left standing in recent history. But with the fall of the Roman Empire and heading into feudalism, many Europeans returned to a fair amount of hunting and gathering, during which time general health improved in the population. Restrictive laws about land use eventually made that difficult, and the land enclosure movement made it impossible for most Europeans.

Even so, all of that is fairly recent in the big scheme of things. It took many millennia of agriculture before it more fully replaced hunting, fishing, trapping, and gathering. In places like the United States, that change is well within living memory. When some of my ancestors immigrated here in the 1600s, Britain and Europe still relied on a good deal of wild food procurement to support their populations. And once here, wild foods were even more plentiful and a lot less work than farming.

Many early American farmers didn’t grow food so much for their own diet as to sell it on the market, sometimes in the form of the popular grain-based alcohols. It was by making alcohol that rural farmers were able to get their product to market without it spoiling. I’m just speculating, but alcohol might have been the most widespread agricultural food of that era because water was often unsafe to drink.

Another commenter, Martin Back, made the same basic point: “Grain these days is cheap thanks to Big Ag and mechanization. It wasn’t always so. If the fields had to be ploughed by draught animals, and the grain weeded, harvested, and threshed by hand, the final product was expensive. Grain became a store of value and a medium of exchange. Eating grains was literally like eating money, so presumably they kept consumption to a minimum.”

In early agriculture, grain was more a way to save wealth than a staple of the diet. It was saved for purposes of trade and also saved for hard times when no other food was available. What didn’t happen was the constant consumption of grain-based foods all day, every day: going from a breakfast of toast and cereal to a lunch of a sandwich and maybe a salad with croutons, then a snack of crackers in the afternoon before eating more bread or noodles for dinner.

Historical Examples

So, I am partly just speculating. But it’s informed speculation. I base my view on specific examples. The most obvious example is hunter-gatherers, poor by the standards of modern industrialization while maintaining great health, as long as their traditional way of life can be maintained. Many populations that are materially better off in terms of a capitalist society (access to comfortable housing, sanitation, healthcare, an abundance of food in grocery stores, etc.) are not better off in terms of chronic diseases.

As the main example I already mentioned, poor Americans have often been a quite healthy lot, as compared to other populations around the world. It is true that poor Americans weren’t particularly healthy in the early colonial period, specifically in Virginia because of indentured servitude. And it’s true that poor Americans today are fairly bad off because of the cheap industrialized diet. Yet for the couple of centuries or so in between, they were doing quite well in terms of health, with lots of access to nutrient-dense wild foods. That point is emphasized by looking at other similar populations at the time, such as back in Europe.

Let’s do some other comparisons. The poor in the Roman Empire did not do well, even when they weren’t enslaved. That was for many reasons, such as growing urbanization and its attendant health risks. When the Roman Empire fell, many of the urban centers collapsed. The poor returned to a more rural lifestyle that depended on a fair amount of wild foods. Studies done on their remains show their health improved during that time. Then at the end of feudalism, with the enclosure movement and the return of mass urbanization, health went back on a decline.

Now I’ll consider the early Egyptians. I’m not sure if there is any info about the diet and health of poor Egyptians. But clearly the ruling class had far from optimal health. It’s hard to make comparisons between then and now, though, because it was an entirely different kind of society. The early Bronze Age civilizations were mostly small city-states that lacked much hierarchy. Early Egypt didn’t even have the most basic infrastructure such as maintained roads and bridges. And the most recent evidence indicates that the pyramid workers weren’t slaves but instead worked freely and seem to have been fed fairly well, whatever that may or may not indicate about their socioeconomic status. The fact that the poor weren’t mummified leaves us with scant evidence that would more directly inform us.

On the other hand, no one can doubt that there have been plenty of poor populations who had truly horrific living standards with much sickness, suffering, and short lifespans. That is particularly true over the millennia as agriculture became ever more central, since that meant periods of abundance alternating with periods of deficiency and sometimes starvation, often combined with weakened immune systems and rampant sickness. That was less the case for the earlier small city-states with less population density and surrounded by the near constant abundance of wilderness areas.

As always, it depends on the specifics we are talking about. Also, any comparison and conclusion is relative.

My mother grew up in a family that hunted, and at the time many Americans still had a certain amount of access to natural areas, something that helped a large part of the population get through the Great Depression and world war era. Nonetheless, by the time of my mother’s childhood, overhunting had depleted most of the wild game (bison, bear, deer, etc. were no longer around) and so her family relied on less desirable foods such as squirrel, raccoon, and opossum; even the fish they ate were less than optimal, as they came from waters polluted by the very factories and railroads her family worked in. So, the wild food opportunities weren’t nearly as good as they had been a half century earlier, much less in the prior centuries.

Not All Poverty is the Same

Being poor today means a lot of things that it didn’t mean in the past. The high rates of heavy metal toxicity today have rarely been seen among previous poor populations. Today 40% of global deaths are caused by air pollution, primarily affecting the poor, which is also extremely different from the past. Beyond that, inequality has grown larger than ever before, and it has been strongly correlated with high rates of stress, disease, homicides, and suicides. Such inequality is also seen in terms of climate change, droughts, refugee crises, and war/occupation.

Here is what Sasha wrote in response to me: “I agree with a lot of your points, except with your assertion that “the poor ate fairly well in many societies especially when they had access to wild sources of food”. I know how the poor ate in Russia in the beginning of the 20th century and how the poor eat now in the former Soviet republics and in India. Their diet is very poor even though they can have access to wild sources of food. I don’t know what the situation was for the poor in ancient Egypt but I would be very surprised if it was better than in modern day India or former Soviet Union.”

I’d imagine modern Russia has high inequality similar to the US. As for modern India, it is one of the most impoverished, densely populated, and malnourished societies around. And modern industrialization did major harm to Hindu Indians, because studies show that traditional vegetarians got a fair amount of nutrients from the insects that were mixed in with pre-modern agricultural goods. Both Russia and India have other problems related to neoliberalism that weren’t factors in the past. It’s an entirely different kind of poverty these days. Even if some Russians have some access to wild foods, I’m willing to bet they have nowhere near the access that was available in previous generations, centuries, and millennia.

Compare modern poverty to that of feudalism. At least in England, feudal peasants were guaranteed to be taken care of in hard times. The Church, a large part of local governance at the time, was tasked with feeding and taking care of the poor and needy, from orphans to widows. They were tight communities that took care of their own, something that no longer exists in most of the world where the individual is left to suffer and struggle. Present Social Darwinian conditions are not the norm for human societies across history. The present breakdown of families and communities is historically unprecedented.

Socialized Medicine & Externalized Costs
An Invisible Debt Made Visible
On Conflict and Stupidity
Inequality in the Anthropocene
Capitalism as Social Control

The Abnormal Norms of WEIRD Modernity

Everything about present populations is extremely abnormal. This is seen in diet as elsewhere. Let me return to the quote I began this post with. “Not until World War II, sixty years later, were measures taken to address the vitamin and mineral deficiencies caused by these grain milling methods. They caught the government’s attention only when 40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty.” * So, what had happened to the health of the American population?

Well, there were many changes. Overhunting, as I already said, made many wild game species extinct or eliminated them from local areas, such that my mother, born in a rural farm state, never saw a white-tailed deer growing up. Also, much earlier, after the Civil War, a new form of enclosure movement happened as laws were passed to prevent people, specifically the then-free blacks, from hunting and foraging wherever they wanted (early American laws often protected the right of anyone to hunt, forage plants, collect timber, etc. from any land that was left open, whether or not it was owned by someone). The carryover from the feudal commons was finally and fully eliminated. It was also the end of the era of free-range cattle ranching, the end having come with the invention of barbed wire. Access to wild foods was further reduced by the creation and enforcement of protected lands (e.g., the federal park system), which very much was targeted at the poor who up to that point had relied upon wild foods for health and survival.

All of that was combined with mass urbanization and industrialization, with all of their new forms of pollution, stress, and inequality. Processed foods were becoming more widespread at the time. Around the turn of the century, unhealthy industrialized vegetable oils became heavily marketed and hence popular, replacing butter and lard. Also, muckraking about the meat industry scared Americans away from meat, and consumption precipitously dropped. As such, in the decades prior to World War II, the American diet had already shifted toward what we now know. A new young generation had grown up on that industrialized and processed diet, and those young people were the ones showing up as recruits for the military. This new diet in such a short period had caused mass malnourishment. It was a mass experiment that showed failure early on, and yet we continue the same basic experiment, not only continuing it but making it far worse.

Government officials and health authorities blamed it on bread production. Refined flour had become widely available because of industrialization, and refining removed all the nutrients that gave bread any health value. In response, there was a movement to fortify bread, initially enforced by federal law and later by state laws. That helped some, but obviously the malnourishment was caused by many other factors that weren’t appreciated by most at the time, even though this was the same period when Weston A. Price’s work was published. Nutritional science was young at the time, and most nutrients were still undiscovered or else unappreciated. Throwing a few lab-produced vitamins back into food barely scratches the surface of the nutrient-density that was lost.

Most Americans continue to have severe nutritional deficiencies. We don’t recognize this fact because being underdeveloped and sickly has become normalized, maybe even in the minds of most doctors and health officials. Besides, many of the worst symptoms don’t show up until decades later, often as chronic diseases of old age, although increasingly seen among the young. Far fewer Americans today would meet the health standards of World War recruits. It’s been a steady decline, despite the miracles of modern medicine in treating symptoms and delaying death.

* The data on the British show an even earlier shift toward malnourishment, because imperial trade brought an industrialized diet to the British population sooner. Also, rural life with a greater diet of wild foods had disappeared more quickly there than in the US. The fate of the British in the late 1800s showed what would happen more than a half century later on the other side of the ocean.

Lore of Nutrition
by Tim Noakes
pp. 373-375

The mid-Victorian period between 1850 and 1880 is now recognised as the golden era of British health. According to P. Clayton and J. Rowbotham,47 this was entirely due to the mid-Victorians’ superior diet. Farm-produced real foods were available in such surplus that even the working-class poor were eating highly nutritious foods in abundance. As a result, life expectancy in 1875 was equal to, or even better than, it is in modern Britain, especially for men (by about three years). In addition, the profile of diseases was quite different when compared to Britain today.

The authors conclude:

[This] shows that medical advances allied to the pharmaceutical industry’s output have done little more than change the manner of our dying. The Victorians died rapidly of infection and/or trauma, whereas we die slowly of degenerative disease. It reveals that with the exception of family planning, the vast edifice of twentieth century healthcare has not enabled us to live longer but has in the main merely supplied methods of suppressing the symptoms of degenerative disease which have emerged due to our failure to maintain mid-Victorian nutritional standards.48

This mid-Victorians’ healthy diet included freely available and cheap vegetables such as onions, carrots, turnips, cabbage, broccoli, peas and beans; fresh and dried fruit, including apples; legumes and nuts, especially chestnuts, walnuts and hazelnuts; fish, including herring, haddock and John Dory; other seafood, including oysters, mussels and whelks; meat – which was considered ‘a mark of a good diet’ so that ‘its complete absence was rare’ – sourced from free-range animals, especially pork, and including offal such as brain, heart, pancreas (sweet breads), liver, kidneys, lungs and intestine; eggs from hens that were kept by most urban households; and hard cheeses.

Their healthy diet was therefore low in cereals, grains, sugar, trans fats and refined flour, and high in fibre, phytonutrients and omega-3 polyunsaturated fatty acids, entirely compatible with the modern Paleo or LCHF diets.

This period of nutritional paradise changed suddenly after 1875, when cheap imports of white flour, tinned meat, sugar, canned fruits and condensed milk became more readily available. The results were immediately noticeable. By 1883, the British infantry was forced to lower its minimum height for recruits by three inches; and by 1900, 50 per cent of British volunteers for the Boer War had to be rejected because of undernutrition. The changes would have been associated with an alteration in disease patterns in these populations, as described by Yellowlees (Chapter 2).

On Obesity and Malnourishment

There is no contradiction, by the way, between rampant nutritional deficiencies and the epidemic of obesity. Gary Taubes noted that the dramatic rise of obesity in America began early in the last century, which is to say it is not a problem that came out of nowhere with the present younger generations. Americans have been getting fatter for a while now. Specifically, they were getting fatter while at the same time being malnourished, partly because of refined flour, as empty a carb as is possible.

Taubes emphasizes the point that this seeming paradox has often been observed among poor populations around the world: a lack of optimal nutrition that leads to ever more weight gain, sometimes with children being skinny to an unhealthy degree only to grow up to be fat. No doubt many Americans in the early 1900s were dealing with much poverty and the lack of nutritious foods that often goes with it. As for today, nutritional deficiency is different because of enrichment, but it persists nonetheless in many other ways. Also, as Keith Payne argues in The Broken Ladder, growing inequality mimics poverty in the conflict and stress it causes. And inequality has everything to do with food quality, as seen with many poor areas being food deserts.

I’ll give you a small taste of Taubes’s discussion. It is from the introduction to one of his books, published a few years ago. If you read the book, look at the section immediately following the passage below. He gives examples of tribes that were poor, didn’t overeat, and did hard manual labor. Yet they were getting obese, even as nearby tribes sometimes remained at a healthy weight. The only apparent difference was what they were eating, not how much they were eating. The populations that saw major weight gain had adopted a grain-based diet, typically because of government rations or government stores.

Why We Get Fat
by Gary Taubes
pp. 17-19

In 1934, a young German pediatrician named Hilde Bruch moved to America, settled in New York City, and was “startled,” as she later wrote, by the number of fat children she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.” Indeed, fat children in New York were so conspicuous that other European immigrants would ask Bruch about it, assuming that she would have an answer. What is the matter with American children? they would ask. Why are they so bloated and blown up? Many would say they’d never seen so many children in such a state.

Today we hear such questions all the time, or we ask them ourselves, with the continual reminders that we are in the midst of an epidemic of obesity (as is the entire developed world). Similar questions are asked about fat adults. Why are they so bloated and blown up? Or you might ask yourself: Why am I?

But this was New York City in the mid-1930s. This was two decades before the first Kentucky Fried Chicken and McDonald’s franchises, when fast food as we know it today was born. This was half a century before supersizing and high-fructose corn syrup. More to the point, 1934 was the depths of the Great Depression, an era of soup kitchens, bread lines, and unprecedented unemployment. One in every four workers in the United States was unemployed. Six out of every ten Americans were living in poverty. In New York City, where Bruch and her fellow immigrants were astonished by the adiposity of the local children, one in four children were said to be malnourished. How could this be?

A year after arriving in New York, Bruch established a clinic at Columbia University’s College of Physicians and Surgeons to treat obese children. In 1939, she published the first of a series of reports on her exhaustive studies of the many obese children she had treated, although almost invariably without success. From interviews with her patients and their families, she learned that these obese children did indeed eat excessive amounts of food—no matter how much either they or their parents might initially deny it. Telling them to eat less, though, just didn’t work, and no amount of instruction or compassion, counseling, or exhortations—of either children or parents—seemed to help.

It was hard to avoid, Bruch said, the simple fact that these children had, after all, spent their entire lives trying to eat in moderation and so control their weight, or at least thinking about eating less than they did, and yet they remained obese. Some of these children, Bruch reported, “made strenuous efforts to lose weight, practically giving up on living to achieve it.” But maintaining a lower weight involved “living on a continuous semi-starvation diet,” and they just couldn’t do it, even though obesity made them miserable and social outcasts.

One of Bruch’s patients was a fine-boned girl in her teens, “literally disappearing in mountains of fat.” This young girl had spent her life fighting both her weight and her parents’ attempts to help her slim down. She knew what she had to do, or so she believed, as did her parents—she had to eat less—and the struggle to do this defined her existence. “I always knew that life depended on your figure,” she told Bruch. “I was always unhappy and depressed when gaining [weight]. There was nothing to live for.… I actually hated myself. I just could not stand it. I didn’t want to look at myself. I hated mirrors. They showed how fat I was.… It never made me feel happy to eat and get fat—but I never could see a solution for it and so I kept on getting fatter.”

pp. 33-34

If we look in the literature—which the experts have not in this case—we can find numerous populations that experienced levels of obesity similar to those in the United States, Europe, and elsewhere today but with no prosperity and few, if any, of the ingredients of Brownell’s toxic environment: no cheeseburgers, soft drinks, or cheese curls, no drive-in windows, computers, or televisions (sometimes not even books, other than perhaps the Bible), and no overprotective mothers keeping their children from roaming free.

In these populations, incomes weren’t rising; there were no labor-saving devices, no shifts toward less physically demanding work or more passive leisure pursuits. Rather, some of these populations were poor beyond our ability to imagine today. Dirt poor. These are the populations that the overeating hypothesis tells us should be as lean as can be, and yet they were not.

Remember Hilde Bruch’s wondering about all those really fat children in the midst of the Great Depression? Well, this kind of observation isn’t nearly as unusual as we might think.

How Americans Used to Eat

Below is a relevant passage. It puts into context how extremely unusual has been the high-carb, low-fat diet these past few generations. This is partly what informed some of my thoughts. We so quickly forget that the present dominance of a grain-based diet wasn’t always the case, likely not even in most agricultural societies until quite recently. In fact, the earlier American diet is still within living memory, although those left to remember it are quickly dying off.

Let me explain why the history of diets matters. One of the arguments for forcing official dietary recommendations onto the entire population was the belief that Americans in a mythical past ate less meat, fat, and butter while eating more bread, legumes, and vegetables. This turns out to have been a trick of limited data.

We now know, from better data, that the complete opposite was the case. And we have further data showing that the rise of the conventional diet has coincided with the rise of obesity and chronic diseases. That isn’t to say eating more vegetables is bad for your health, but we do know that even as the average American’s intake of vegetables has gone up, so have all the diet-related health conditions. During this time, what went down was the consumption of all the traditional foods of the American diet going back to the colonial era: wild game, red meat, organ meat, lard, and butter. These are the foods Americans ate in huge amounts prior to the industrialized diet.

What added to the confusion and misinterpretation of the evidence was a matter of timing. Diet and nutrition were first seriously studied right at the moment when, for most populations, they had already changed. That was the failure of Ancel Keys’ research on what came to be called the Mediterranean diet (see Sally Fallon Morrell’s Nourishing Diets). The population was recuperating from World War II, which had devastated their traditional way of life, including their diet. Keys took the post-war deprivation diet as the historical norm, but the reality was far different. Cookbooks and other evidence from before the war showed that this population used to eat higher levels of meat and fat, including saturated fat. So, the very people he focused on had grown up and spent most of their lives on a diet that was, at that moment, no longer available because of the disruption of the food system. What good health Keys observed came from a lifetime of eating a different diet. Combined with cherry-picking of data and biased analysis, Keys came to a conclusion that was as wrong as wrong could be.

Slightly earlier, Weston A. Price was able to see a different picture. He intentionally traveled to the places where traditional diets remained fully in place. And the devastation of World War II had yet to happen. Price came to the conclusion that what mattered most of all was nutrient-density. Sure, the vegetables eaten then would have been of a higher quality than we get today, largely because they were heirloom cultivars grown on healthy soil. Nutrient-dense foods can only come from nutrient-dense soil, whereas today our food is nutrient-deficient because our soil is highly depleted. The same goes for animal foods. Animals pastured on healthy land will produce healthy dairy, eggs, meat, and fat; these foods will be high in omega-3s and the fat-soluble vitamins.

No matter whether it comes from plant sources or animal sources, nutrient-density might be the most important factor of all. Why fat is meaningful in this context is that fat is where the fat-soluble vitamins are found, and it is through fat that they are metabolized. In turn, the fat-soluble vitamins play a key role in the absorption and processing of numerous other nutrients, not to mention in numerous functions in the body. Nutrient-density and fat-density go hand in hand in terms of general health. That is what early Americans were getting in eating so much wild food, not only wild game but also wild greens, fruit, and mushrooms. And nutrient-density is precisely what we are lacking today, as the nutrients have been intentionally removed to make more palatable commercial foods.

Once again, this has a class dimension, since the wealthier have more access to nutrient-dense foods. Few poor people could afford to shop at a high-end health food store, even if one were located near their home. But it was quite different in the past, when nutrient-dense foods were available to everyone and sometimes more available to the poor concentrated in rural areas. If we want to improve public health, the first thing we should do is return to this historical norm.

The Big Fat Surprise
by Nina Teicholz
pp. 123-131

Yet despite this shaky and often contradictory evidence, the idea that red meat is a principal dietary culprit has thoroughly pervaded our national conversation for decades. We have been led to believe that we’ve strayed from a more perfect, less meat-filled past. Most prominently, when Senator McGovern announced his Senate committee’s report, called Dietary Goals, at a press conference in 1977, he expressed a gloomy outlook about where the American diet was heading. “Our diets have changed radically within the past fifty years,” he explained, “with great and often harmful effects on our health.” Hegsted, standing at his side, criticized the current American diet as being excessively “rich in meat” and other sources of saturated fat and cholesterol, which were “linked to heart disease, certain forms of cancer, diabetes and obesity.” These were the “killer diseases,” said McGovern. The solution, he declared, was for Americans to return to the healthier, plant-based diet they once ate.

The New York Times health columnist Jane Brody perfectly encapsulated this idea when she wrote, “Within this century, the diet of the average American has undergone a radical shift away from plant-based foods such as grains, beans and peas, nuts, potatoes, and other vegetables and fruits and toward foods derived from animals—meat, fish, poultry, eggs and dairy products.” It is a view that has been echoed in literally hundreds of official reports.

The justification for this idea, that our ancestors lived mainly on fruits, vegetables, and grains, comes mainly from the USDA “food disappearance data.” The “disappearance” of food is an approximation of supply; most of it is probably being eaten, but much is wasted, too. Experts therefore acknowledge that the disappearance numbers are merely rough estimates of consumption. The data from the early 1900s, which is what Brody, McGovern, and others used, are known to be especially poor. Among other things, these data accounted only for the meat, dairy, and other fresh foods shipped across state lines in those early years, so anything produced and eaten locally, such as meat from a cow or eggs from chickens, would not have been included. And since farmers made up more than a quarter of all workers during these years, local foods must have amounted to quite a lot. Experts agree that this early availability data are not adequate for serious use, yet they cite the numbers anyway, because no other data are available. And for the years before 1900, there are no “scientific” data at all.

In the absence of scientific data, history can provide a picture of food consumption in the late eighteenth to nineteenth century in America. Although circumstantial, historical evidence can also be rigorous and, in this case, is certainly more far-reaching than the inchoate data from the USDA. Academic nutrition experts rarely consult historical texts, considering them to occupy a separate academic silo with little to offer the study of diet and health. Yet history can teach us a great deal about how humans used to eat in the thousands of years before heart disease, diabetes, and obesity became common. Of course we don’t remember now, but these diseases did not always rage as they do today. And looking at the food patterns of our relatively healthy early-American ancestors, it’s quite clear that they ate far more red meat and far fewer vegetables than we have commonly assumed.

Early-American settlers were “indifferent” farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with “the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,” as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.

The endless bounty of America in its early years is truly astonishing. Settlers recorded the extraordinary abundance of wild turkeys, ducks, grouse, pheasant, and more. Migrating flocks of birds would darken the skies for days. The tasty Eskimo curlew was apparently so fat that it would burst upon falling to the earth, covering the ground with a sort of fatty meat paste. (New Englanders called this now-extinct species the “doughbird.”)

In the woods, there were bears (prized for their fat), raccoons, bobolinks, opossums, hares, and virtual thickets of deer—so much that the colonists didn’t even bother hunting elk, moose, or bison, since hauling and conserving so much meat was considered too great an effort. IX

A European traveler describing his visit to a Southern plantation noted that the food included beef, veal, mutton, venison, turkeys, and geese, but he does not mention a single vegetable. Infants were fed beef even before their teeth had grown in. The English novelist Anthony Trollope reported, during a trip to the United States in 1861, that Americans ate twice as much beef as did Englishmen. Charles Dickens, when he visited, wrote that “no breakfast was breakfast” without a T-bone steak. Apparently, starting a day on puffed wheat and low-fat milk—our “Breakfast of Champions!”—would not have been considered adequate even for a servant.

Indeed, for the first 250 years of American history, even the poor in the United States could afford meat or fish for every meal. The fact that the workers had so much access to meat was precisely why observers regarded the diet of the New World to be superior to that of the Old. “I hold a family to be in a desperate way when the mother can see the bottom of the pork barrel,” says a frontier housewife in James Fenimore Cooper’s novel The Chainbearer.

Like the primitive tribes mentioned in Chapter 1, Americans also relished the viscera of the animal, according to the cookbooks of the time. They ate the heart, kidneys, tripe, calf sweetbreads (glands), pig’s liver, turtle lungs, the heads and feet of lamb and pigs, and lamb tongue. Beef tongue, too, was “highly esteemed.”

And not just meat but saturated fats of every kind were consumed in great quantities. Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard. X

In the book Putting Meat on the American Table, researcher Roger Horowitz scours the literature for data on how much meat Americans actually ate. A survey of eight thousand urban Americans in 1909 showed that the poorest among them ate 136 pounds a year, and the wealthiest more than 200 pounds. A food budget published in the New York Tribune in 1851 allots two pounds of meat per day for a family of five. Even slaves at the turn of the eighteenth century were allocated an average of 150 pounds of meat a year. As Horowitz concludes, “These sources do give us some confidence in suggesting an average annual consumption of 150–200 pounds of meat per person in the nineteenth century.”

About 175 pounds of meat per person per year! Compare that to the roughly 100 pounds of meat per year that an average adult American eats today. And of that 100 pounds of meat, more than half is poultry—chicken and turkey—whereas until the mid-twentieth century, chicken was considered a luxury meat, on the menu only for special occasions (chickens were valued mainly for their eggs). Subtracting out the poultry factor, we are left with the conclusion that per capita consumption of red meat today is about 40 to 70 pounds per person, according to different sources of government data—in any case far less than what it was a couple of centuries ago.

Yet this drop in red meat consumption is the exact opposite of the picture we get from public authorities. A recent USDA report says that our consumption of meat is at a “record high,” and this impression is repeated in the media. It implies that our health problems are associated with this rise in meat consumption, but these analyses are misleading because they lump together red meat and chicken into one category to show the growth of meat eating overall, when it’s just the chicken consumption that has gone up astronomically since the 1970s. The wider-lens picture is clearly that we eat far less red meat today than did our forefathers.

Meanwhile, also contrary to our common impression, early Americans appeared to eat few vegetables. Leafy greens had short growing seasons and were ultimately considered not worth the effort. They “appeared to yield so little nutriment in proportion to labor spent in cultivation,” wrote one eighteenth-century observer, that “farmers preferred more hearty foods.” Indeed, a pioneering 1888 report for the US government written by the country’s top nutrition professor at the time concluded that Americans living wisely and economically would be best to “avoid leafy vegetables,” because they provided so little nutritional content. In New England, few farmers even had many fruit trees, because preserving fruits required equal amounts of sugar to fruit, which was far too costly. Apples were an exception, and even these, stored in barrels, lasted several months at most.

It seems obvious, when one stops to think, that before large supermarket chains started importing kiwis from New Zealand and avocados from Israel, a regular supply of fruits and vegetables could hardly have been possible in America outside the growing season. In New England, that season runs from June through October or maybe, in a lucky year, November. Before refrigerated trucks and ships allowed the transport of fresh produce all over the world, most people could therefore eat fresh fruit and vegetables for less than half the year; farther north, winter lasted even longer. Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

Thus it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverly Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator.

So by these accounts, for the first two hundred and fifty years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

During all this time, however, heart disease was almost certainly rare. Reliable data from death certificates is not available, but other sources of information make a persuasive case against the widespread appearance of the disease before the early 1920s. Austin Flint, the most authoritative expert on heart disease in the United States, scoured the country for reports of heart abnormalities in the mid-1800s, yet reported that he had seen very few cases, despite running a busy practice in New York City. Nor did William Osler, one of the founding professors of Johns Hopkins Hospital, report any cases of heart disease during the 1870s and eighties when working at Montreal General Hospital. The first clinical description of coronary thrombosis came in 1912, and an authoritative textbook in 1915, Diseases of the Arteries including Angina Pectoris, makes no mention at all of coronary thrombosis. On the eve of World War I, the young Paul Dudley White, who later became President Eisenhower’s doctor, wrote that of his seven hundred male patients at Massachusetts General Hospital, only four reported chest pain, “even though there were plenty of them over 60 years of age then.” XI About one fifth of the US population was over fifty years old in 1900. This number would seem to refute the familiar argument that people formerly didn’t live long enough for heart disease to emerge as an observable problem. Simply put, there were some ten million Americans of a prime age for having a heart attack at the turn of the twentieth century, but heart attacks appeared not to have been a common problem.

Was it possible that heart disease existed but was somehow overlooked? The medical historian Leon Michaels compared the record on chest pain with that of two other medical conditions, gout and migraine, which are also painful and episodic and therefore should have been observed by doctors to an equal degree. Michaels catalogs the detailed descriptions of migraines dating all the way back to antiquity; gout, too, was the subject of lengthy notes by doctors and patients alike. Yet chest pain is not mentioned. Michaels therefore finds it “particularly unlikely” that angina pectoris, with its severe, terrifying pain continuing episodically for many years, could have gone unnoticed by the medical community, “if indeed it had been anything but exceedingly rare before the mid-eighteenth century.” XII

So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. XIII

Ironically—or perhaps tellingly—the heart disease “epidemic” began after a period of exceptionally reduced meat eating. The publication of The Jungle, Upton Sinclair’s fictionalized exposé of the meatpacking industry, caused meat sales in the United States to fall by half in 1906, and they did not revive for another twenty years. In other words, meat eating went down just before coronary disease took off. Fat intake did rise during those years, from 1909 to 1961, when heart attacks surged, but this 12 percent increase in fat consumption was not due to a rise in animal fat. It was instead owing to an increase in the supply of vegetable oils, which had recently been invented.

Nevertheless, the idea that Americans once ate little meat and “mostly plants”—espoused by McGovern and a multitude of experts—continues to endure. And Americans have for decades now been instructed to go back to this earlier, “healthier” diet that seems, upon examination, never to have existed.