There has been much debate about the causes of autism, from genetics to diet. Many have suspected a link to heavy metals. A 2017 NIH-funded study found strong evidence of such a link in the baby teeth of children with autism, indicating early-life lead exposure (see below).
The question is why autism rates would be increasing if lead toxicity rates are not. One thing to keep in mind is that, though lead pollution has declined, the environment remains filled with lead and other heavy metals — in the soil, paint, and pipes. Lead exposure is still extremely common, and even low doses can be damaging.
That brings us to a congressional investigation released just a couple of days ago (see below). Most of us may not be breathing more lead pollution and paint dust, or drinking more lead in our water. But we might still be getting excessive levels of lead in our food. The congressional investigation specifically found toxic levels of heavy metals in nearly all baby food tested.
What is uncertain is whether this represents some kind of change. Has there been a change in farming practices, or a change somewhere else in the food supply, that is increasing heavy metal concentrations? Or is it some combination of other factors that is somehow worsening the effect of heavy metals already accumulated in the soil?
Basically, why does the autism rate appear to be on the rise? That is a mystery, if we invoke lead toxicity as the central cause. Overall, lead toxicity rates have been declining since the heavy toll that spiked during the childhoods of GenXers and young Boomers in the 1960s and 1970s, prior to environmental regulations.
Looking at baby teeth and baby food might help us grasp the key factor. It’s not lead exposure in general that matters but exposure during a specific period of development. What we need to be looking at is the lead toxicity rates of babies and pregnant mothers, but such testing is not standard. Children typically are tested only after, not before, they show health and developmental problems.
So it’s possible that, even though there is less lead exposure on average across childhood and adulthood, lead exposure in infancy might have gone up. This could be caused, for example, by increasing imports of baby food from countries with weak environmental regulations and heavier chemical use in farming.
This is concerning, as the long-term effects of heavy metal toxicity are diverse and sometimes devastating — besides autism: behavioral issues, impulse-control issues, aggression, lowered IQ, and so on, along with physical health problems. We might be seeing another generation or two of lead toxicity damage, exacerbated in the poor communities still struggling with already high rates of lead toxicity from old housing and industrial residue.
Baby teeth from children with autism contain more toxic lead and less of the essential nutrients zinc and manganese, compared to teeth from children without autism, according to an innovative study funded by the National Institute of Environmental Health Sciences (NIEHS), part of the National Institutes of Health. The researchers studied twins to control genetic influences and focus on possible environmental contributors to the disease. The findings, published June 1 in the journal Nature Communications, suggest that differences in early-life exposure to metals, or more importantly how a child’s body processes them, may affect the risk of autism.
The differences in metal uptake between children with and without autism were especially notable during the months just before and after the children were born. The scientists determined this by using lasers to map the growth rings in baby teeth generated during different developmental periods.
The researchers observed higher levels of lead in children with autism throughout development, with the greatest disparity observed during the period following birth.
Four leading baby food manufacturers knowingly sold baby food that contained high levels of toxic heavy metals, according to internal company documents included in a congressional investigation released Thursday.
“Dangerous levels of toxic metals like arsenic, lead, cadmium and mercury exist in baby foods at levels that exceed what experts and governing bodies say are permissible,” said Democratic Rep. Raja Krishnamoorthi of Illinois, chair of the House Subcommittee on Economic and Consumer Policy, which conducted the investigation; the report was signed by the subcommittee’s Democratic members.
Krishnamoorthi said the spreadsheets provided by manufacturers are “shocking” because they show evidence that some baby foods contain hundreds of parts per billion of dangerous metals. “Yet we know that in a lot of cases, we should not have anything more than single digit parts per billion of any of these metals in any of our foods,” he told CNN.
As natural elements, heavy metals are in the soil in which crops are grown and thus can’t be avoided. Some crop fields and regions, however, contain more toxic levels than others, partly due to the overuse of metal-containing pesticides and ongoing industrial pollution.
“There was a time where we used metals as the predominant pesticide for many years, assuming it was safe,” said Dr. Leonardo Trasande, chief of environmental pediatrics at NYU Langone.
The US Food and Drug Administration has not yet set maximum levels for heavy metals in most infant foods. The agency did set a standard of 100 parts per billion of inorganic arsenic for infant rice cereal, but critics say even that level is much too high for babies’ safety, especially since the FDA has already set a far lower standard of 10 parts per billion of inorganic arsenic for bottled water.
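To put those two standards side by side, here’s a quick back-of-the-envelope calculation in Python. The two limits are from the text above; the serving sizes are my own illustrative assumptions, not figures from the FDA or the investigation.

```python
# Rough comparison of the two FDA inorganic-arsenic standards cited above.
# The limits (100 ppb for infant rice cereal, 10 ppb for bottled water)
# come from the text; the serving sizes are illustrative assumptions.

def dose_micrograms(limit_ppb, amount_kg):
    """Micrograms of contaminant in amount_kg of food/drink at limit_ppb.
    1 ppb by mass = 1 microgram per kilogram."""
    return limit_ppb * amount_kg

cereal_limit_ppb = 100.0   # infant rice cereal standard (from the text)
water_limit_ppb = 10.0     # bottled water standard (from the text)

serving_cereal_kg = 0.015  # assumed ~15 g of dry cereal per serving
serving_water_kg = 0.240   # assumed ~240 mL (~0.24 kg) of water

print(f"Concentration ratio: {cereal_limit_ppb / water_limit_ppb:.0f}x")
print(f"Cereal serving at the limit: {dose_micrograms(cereal_limit_ppb, serving_cereal_kg):.1f} ug")
print(f"Water serving at the limit:  {dose_micrograms(water_limit_ppb, serving_water_kg):.1f} ug")
```

The cereal standard is ten times the water standard by concentration; the actual dose per serving depends on how much a baby eats or drinks, which is why critics focus on the concentration gap.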
From the time of conception through the age of 2, babies have an extremely high sensitivity to neurotoxic chemicals, said Jane Houlihan, the national director of science and health for Healthy Babies Bright Futures, a coalition of advocates committed to reducing babies’ exposures to neurotoxic chemicals.
“Their brain is forming rapidly, and so when they’re exposed to metals that can interrupt those natural processes, the impacts range from behavioral problems to aggression to IQ loss and all kinds of cognitive and behavioral deficits that can persist throughout life,” Houlihan said.
“Pound for pound, babies get the highest dose of these heavy metals compared to other parts of the population,” she said. “So the consequences are serious.”
Healthy Babies Bright Futures published a report in 2019 that found toxic metals in 95% of the baby foods randomly pulled off supermarket shelves and tested — that exposé was the “inspiration” for the subcommittee’s work, Krishnamoorthi told CNN.
There are multiple folktales about the tender senses of royalty, aristocrats, and other elites. The best-known example is “The Princess and the Pea”. In the Aarne-Thompson-Uther system of folktale categorization, it is listed as type 704, about the search for a sensitive wife, though not all the narrative variants of elite sensitivity involve potential wives. The man who made this particular story famous is Hans Christian Andersen, who published his version in 1835. He longed to be a part of the respectable class, but felt excluded. Some speculate that he projected his own class issues onto his slightly altered version of the folktale, something discussed in the Wikipedia article about the story:
“Wullschlager observes that in “The Princess and the Pea” Andersen blended his childhood memories of a primitive world of violence, death and inexorable fate, with his social climber’s private romance about the serene, secure and cultivated Danish bourgeoisie, which did not quite accept him as one of their own. Researcher Jack Zipes said that Andersen, during his lifetime, “was obliged to act as a dominated subject within the dominant social circles despite his fame and recognition as a writer”; Andersen therefore developed a feared and loved view of the aristocracy. Others have said that Andersen constantly felt as though he did not belong, and longed to be a part of the upper class.[11] The nervousness and humiliations Andersen suffered in the presence of the bourgeoisie were mythologized by the storyteller in the tale of “The Princess and the Pea”, with Andersen himself the morbidly sensitive princess who can feel a pea through 20 mattresses.[12] Maria Tatar notes that, unlike the folk heroine of his source material for the story, Andersen’s princess has no need to resort to deceit to establish her identity; her sensitivity is enough to validate her nobility. For Andersen, she indicates, “true” nobility derived not from an individual’s birth but from their sensitivity. Andersen’s insistence upon sensitivity as the exclusive privilege of nobility challenges modern notions about character and social worth. The princess’s sensitivity, however, may be a metaphor for her depth of feeling and compassion.[1] […] Researcher Jack Zipes notes that the tale is told tongue-in-cheek, with Andersen poking fun at the “curious and ridiculous” measures taken by the nobility to establish the value of bloodlines. He also notes that the author makes a case for sensitivity being the decisive factor in determining royal authenticity and that Andersen “never tired of glorifying the sensitive nature of an elite class of people”.[15]”
Even if that is true, there is more going on here than some guy working out his personal issues through fiction. This princess’s sensory sensitivity sounds like autism spectrum disorder, and I have a theory about that. Autism has been associated with certain foods like wheat, specifically refined flour in highly processed foods (The Agricultural Mind). And a high-carb diet in general causes numerous neurocognitive problems (Ketogenic Diet and Neurocognitive Health), along with other health conditions such as metabolic syndrome (Dietary Dogma: Tested and Failed), insulin resistance (Coping Mechanisms of Health), atherosclerosis (Ancient Atherosclerosis?), and scurvy (Sailors’ Rations, a High-Carb Diet) — by the way, the rates of these diseases have been increasing over the generations, often first appearing among the affluent. Sure, grains have long been part of the diet, but the one grain most associated with the wealthy going back millennia was wheat, as it was harder to grow, which kept it in short supply and expensive. Indeed, it is wheat, not the other grains, that gets brought up in relation to autism, largely because of gluten, though other components have been pointed to.
It is relevant that the historical period in which these stories were written down was around when the first large grain surpluses were becoming common, and so bread, white bread most of all, became a greater part of the diet, first among the upper classes. It’s too bad we don’t have cross-generational data on autism rates broken down by demographics and diet, but it is interesting to note that neurasthenia, a 19th-century mental health condition also involving sensitivity, was seen as a disease of the middle-to-upper class (The Crisis of Identity), and this notion of the elite as sensitive was a romanticized ideal going back to the 1700s with what Jane Austen referred to as ‘sensibility’ (see Bryan Kozlowski’s The Jane Austen Diet, as quoted in the link immediately above). In that same historical period, others noted that schizophrenia was spreading along with civilization (e.g., Samuel Gridley Howe and Henry Maudsley; see The Invisible Plague by Edwin Fuller Torrey & Judy Miller), and I’d add that there appear to be some overlapping factors between schizophrenia and autism — besides gluten, some of the implicated factors are glutamate, exorphins, inflammation, etc. “It is unlikely,” writes William Davis, “that wheat exposure was the initial cause of autism or ADHD but, as with schizophrenia, wheat appears to be associated with worsening characteristics of the conditions” (Wheat Belly, p. 48).
For most of human history, crop failures and famine were regular occurrences. They most harshly affected the poor masses when grain and bread prices went up, leading to food riots and sometimes revolutions (e.g., the French Revolution). Before the 1800s, grains were so expensive that, to make them affordable, breads were often adulterated with fillers or replaced entirely with grain substitutes, the latter referred to as “famine breads” and sometimes made with tree bark. Even when bread was available, the average person might spend most of their money on it, as it was one of the most costly foods around and other foods weren’t always easily obtained.
Even so, grain being highly sought after certainly doesn’t imply that the average person was eating a high-carb diet, quite the opposite (A Common Diet). Food in general was expensive and scarce and, among grains, wheat was the least common. At times, this would have forced feudal peasants and later landless peasants onto a diet limited in both carbohydrates and calories, which would have meant a typically ketogenic state (Fasting, Calorie Restriction, and Ketosis), albeit far from an optimal way of achieving it. The further back in time one looks, the greater the prevalence of ketosis would have been (e.g., the Spartan and Mongol diets), maybe with the exception of the ancient Egyptians (Ancient Atherosclerosis?). In places like Ireland and Russia, the lower classes remained on this poverty diet, often a starvation diet, well into the mid-to-late 1800s, although in the case of the Irish it was an artificially constructed famine, as Ireland’s other food crops were exported by the English to the international market even as the potato crop failed.
Yet, in America, the poor were fortunate in being able to rely on a meat-based diet because wild game was widely available and easily obtained, even in cities. That may have been true for many European populations as well during earlier feudalism, specifically prior to the peasants being restricted from hunting and trapping on the commons; this is suggested by how health improved after the fall of the Roman Empire (Malnourished Americans). During this earlier period, only the wealthy could afford high-quality bread and large amounts of grain-based foods in general. That meant highly refined and fluffy white bread that couldn’t easily be adulterated. Likewise, for the early centuries of colonialism, sugar was available only to the wealthy — in fact, it was a controlled substance typically found only in pharmacies. But for the elite who had access, sugary pastries and other starchy dessert foods became popular. White bread and pastries were status symbols. Sugar was so scarce that wealthy households kept it locked away so the servants couldn’t steal it. Even fruit was disproportionately eaten by the wealthy. A fruit pie, combining all three of the above ingredients in a single delicacy, would truly have been a luxury.
Part of the context is that, although grain yields had been increasing during the early colonial era, there weren’t dependable grain surpluses before the 1800s. Until then, white bread, pastries, and the like simply were not affordable to most people. Consumption of grains, along with other starchy carbs and sugar, rose with 19th-century advancements in agriculture. Simultaneously, income was increasing and the middle class was growing. But even as yields increased, most of the surplus grain went to feeding livestock, not feeding the poor. Grains were perceived as cattle feed. Protein consumption increased more than carbohydrate consumption did, at least initially. The American population, in particular, didn’t see the development of a high-carb diet until much later, as US mass urbanization also happened later.
Coming to the end of the 19th century, there emerged the mass diet of starchy and sugary foods, especially with the spread of wheat farming and white bread. And, in the US, only by the 20th century did grain consumption finally surpass meat consumption. Following that, there have been growing rates of autism. Along with sensory sensitivity, autistics are well known for their pickiness about foods and their cravings for particular foods, such as those made from highly refined wheat flour, from white bread to crackers. Yet the folktales in question were speaking to a still-living memory of an earlier time when these changes had yet to happen. Hans Christian Andersen first published “The Princess and the Pea” in 1835, but such stories had been told orally long before that, probably going back at least centuries, although we now know that some of these folktales have their origins millennia earlier, even into the Bronze Age. According to the Wikipedia article on “The Princess and the Pea”,
“The theme of this fairy tale is a repeat of that of the medieval Perso-Arabic legend of al-Nadirah.[6] […] Tales of extreme sensitivity are infrequent in world culture but a few have been recorded. As early as the 1st century, Seneca the Younger had mentioned a legend about a Sybaris native who slept on a bed of roses and suffered due to one petal folding over.[23] The 11th-century Kathasaritsagara by Somadeva tells of a young man who claims to be especially fastidious about beds. After sleeping in a bed on top of seven mattresses and newly made with clean sheets, the young man rises in great pain. A crooked red mark is discovered on his body and upon investigation a hair is found on the bottom-most mattress of the bed.[5] An Italian tale called “The Most Sensitive Woman” tells of a woman whose foot is bandaged after a jasmine petal falls upon it.”
I would take it as telling that this particular folktale doesn’t appear to be as ancient as other examples. That would support my argument that the sensory sensitivity of autism might be caused by greater consumption of refined wheat, something that only began to appear late in the Axial Age and only became common much later. Even the few wealthy who did have access in ancient times were eating rather limited amounts of white bread. It might have required hitting a certain level of intake, not seen until modernity or close to it, before extreme autistic symptoms became noticeable among a larger number of the aristocracy and monarchy.
Do you know where the term “refined” comes from? Around 1826, the whole grain bread eaten by the military was held up as superior for health versus the white refined bread eaten by the aristocracy. Before the industrial revolution, refining flour was more labor-intensive and expensive, so white bread was the staple loaf of the aristocracy. That’s why it was called “refined”.
Bread has always been political. For Romans, it helped define class; white bread was for aristocrats, while the darkest brown loaves were for the poor. Later, Jacobin radicals claimed white bread for the masses, while bread riots have been a perennial theme of populist uprisings. But the political meaning of the staff of life changed dramatically in the early twentieth-century United States, as Aaron Bobrow-Strain, who went on to write the book White Bread, explained in a 2007 paper. […]
Even before this industrialization of baking, white flour had had its critics, like cracker inventor Sylvester Graham. Now, dietary experts warned that white bread was, in the words of one doctor, “so clean a meal worm can’t live on it for want of nourishment.” Or, as doctor and radio host P.L. Clark told his audience, “the whiter your bread, the sooner you’re dead.”
Furthermore, one should not disregard the cultural context of food consumption. Habits may develop that prevent the attainment of a level of nutritional status commensurate with actual real income. For instance, the consumption of white bread or of polished rice, instead of whole-wheat bread or unpolished rice, might increase with income, but might detract from the body’s well-being. Insofar as cultural habits change gradually over time, significant lags could develop between income and nutritional status.
pp. 192-194
As a consequence, per capita food consumption could have increased between 1660 and 1740 by as much as 50 percent. The fact that real wages were higher in the 1730s than at any time since 1537 indicates that a high standard of living was reached. The increase in grain exports, from 2.8 million quintals in the first decade of the eighteenth century to 6 million by the 1740s, is also indicative of the availability of nutrients.
The remarkably good harvests were brought about by the favorable weather conditions of the 1730s. In England the first four decades of the eighteenth century were much warmer than the last decades of the previous century (Table 5.1). Even small differences in temperature may have important consequences for production. […] As a consequence of high yields the price of consumables declined by 14 percent in the 1730s relative to the 1720s. Wheat cost 30 percent less in the 1730s than it did in the 1660s. […] The increase in wheat consumption was particularly important because wheat was less susceptible to mold than rye. […]
There is direct evidence that the nutritional status of many populations was, indeed, improving in the early part of the eighteenth century, because human stature was generally increasing in Europe as well as in America (see Chapter 2). This is a strong indication that protein and caloric intake rose. In the British colonies of North America, an increase in food consumption—most importantly, of animal protein—in the beginning of the eighteenth century has been directly documented. Institutional menus also indicate that diets improved in terms of caloric content.
Changes in British income distribution conform to the above pattern. Low food prices meant that the bottom 40 percent of the distribution was gaining between 1688 and 1759, but by 1800 had declined again to the level of 1688. This trend is another indication that a substantial portion of the population that was at a nutritional disadvantage was doing better during the first half of the eighteenth century than it did earlier, but that the gains were not maintained throughout the century.
The Roots of Rural Capitalism: Western Massachusetts, 1780-1860, by Christopher Clark, p. 77
Livestock also served another role, as a kind of “regulator,” balancing the economy’s need for sufficiency and the problems of producing too much. In good years, when grain and hay were plentiful, surpluses could be directed to fattening cattle and hogs for slaughter, or for exports to Boston and other markets on the hoof. Butter and cheese production would also rise, for sale as well as for family consumption. In poorer crop years, however, with feedstuffs rarer, cattle and swine could be slaughtered in greater numbers for household and local consumption, or for export as dried meat.
p. 82
Increased crop and livestock production were linked. As grain supplies began to overtake local population increases, more corn in particular became available for animal feed. Together with hay, this provided sufficient feedstuffs for farmers in the older Valley towns to undertake winter cattle fattening on a regular basis, without such concern as they had once had for fluctuations in output near the margins of subsistence. Winter fattening for market became an established practice on more farms.
But food played an even larger role in the French Revolution just a few years later. According to Cuisine and Culture: A History of Food and People, by Linda Civitello, two of the most essential elements of French cuisine, bread and salt, were at the heart of the conflict; bread, in particular, was tied up with the national identity. “Bread was considered a public service necessary to keep the people from rioting,” Civitello writes. “Bakers, therefore, were public servants, so the police controlled all aspects of bread production.”
If bread seems a trifling reason to riot, consider that it was far more than something to sop up bouillabaisse for nearly everyone but the aristocracy—it was the main component of the working Frenchman’s diet. According to Sylvia Neely’s A Concise History of the French Revolution, the average 18th-century worker spent half his daily wage on bread. But when the grain crops failed two years in a row, in 1788 and 1789, the price of bread shot up to 88 percent of his wages. Many blamed the ruling class for the resulting famine and economic upheaval.
Read more: https://www.smithsonianmag.com/arts-culture/when-food-changed-history-the-french-revolution-93598442/
Through 1788 and into 1789 the gods seemed to be conspiring to bring on a popular revolution. A spring drought was followed by a devastating hailstorm in July. Crops were ruined. There followed one of the coldest winters in French history. Grain prices skyrocketed. Even in the best of times, an artisan or factory worker might spend 40 percent of his income on bread. By the end of the year, 80 percent was not unusual. “It was the connection of anger with hunger that made the Revolution possible,” observed Schama. It was also envy that drove the Revolution to its violent excesses and destructive reform.
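It’s worth spelling out the arithmetic implied by those budget shares. A minimal sketch in Python, assuming (simplistically) that wages and the quantity of bread bought stayed constant between the two dates:

```python
# Implied bread-price increases from the two sets of figures quoted above
# (Neely: 50% -> 88% of wages; Schama: 40% -> 80% of income). If wages and
# bread consumption are held constant, the ratio of the two budget shares
# equals the ratio of prices.

for source, share_before, share_after in [
    ("Neely (daily wage on bread)", 0.50, 0.88),
    ("Schama (income on bread)", 0.40, 0.80),
]:
    increase = share_after / share_before - 1
    print(f"{source}: implied price increase of {increase:.0%}")
# -> Neely: 76%, Schama: 100%
```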
Take the Reveillon riots of April 1789. Reveillon was a successful Parisian wall-paper manufacturer. He was not a noble but a self-made man who had begun as an apprentice paper worker but now owned a factory that employed 400 well-paid operatives. He exported his finished products to England (no mean feat). The key to his success was technical innovation, machinery, the concentration of labor, and the integration of industrial processes, but for all these the artisans of his district saw him as a threat to their jobs. When he spoke out in favor of the deregulation of bread distribution at an electoral meeting, an angry crowd marched on his factory, wrecked it, and ransacked his home.
Only in the late nineteenth and twentieth century did large numbers of “our ancestors”–and obviously this depends on which part of the world they lived in–begin eating white bread. […]
Wheat bread was for the few. Wheat did not yield well (only seven or eight grains for one planted compared to corn that yielded dozens) and is fairly tricky to grow.
White puffy wheat bread was for even fewer. Whiteness was achieved by sieving out the skin of the grain (bran) and the germ (the bit that feeds the new plant). In a world of scarcity, this made wheat bread pricey. And puffy, well, that takes fairly skilled baking plus either yeast from beer or the kind of climate that sourdough does well in. […]
Between 1850 and 1950, the price of wheat bread, even white wheat bread, plummeted in price as a result of the opening up of new farms in the US and Canada, Argentina, Australia and other places, the mechanization of plowing and harvesting, the introduction of huge new flour mills, and the development of continuous flow bakeries.
In 1800 only half the British population could afford wheat bread. In 1900 everybody could.
In Georgian times the introduction of sieves made of Chinese silk helped to produce finer, whiter flour and white bread gradually became more widespread. […]
1757
A report accused bakers of adulterating bread by using alum, lime, chalk and powdered bones to keep it very white. Parliament banned alum and all other additives in bread but some bakers ignored the ban. […]
1815
The Corn Laws were passed to protect British wheat growers. The duty on imported wheat was raised and price controls on bread lifted. Bread prices rose sharply. […]
1826
Wholemeal bread, eaten by the military, was recommended as being healthier than the white bread eaten by the aristocracy.
1834
Rollermills were invented in Switzerland. Whereas stonegrinding crushed the grain, distributing the vitamins and nutrients evenly, the rollermill broke open the wheat berry and allowed easy separation of the wheat germ and bran. This process greatly eased the production of white flour but it was not until the 1870s that it became economic. Steel rollermills gradually replaced the old windmills and watermills.
1846
With large groups of the population near to starvation the Corn Laws were repealed and the duty on imported grain was removed. Importing good quality North American wheat enabled white bread to be made at a reasonable cost. Together with the introduction of the rollermill this led to the increase in the general consumption of white bread – for so long the privilege of the upper classes.
In many contexts Linné explained how people with different standing in society eat different types of bread. He wrote, “Wheat bread, the most excellent of all, is used only by high-class people”, whereas “barley bread is used by our peasants” and “oat bread is common among the poor”. He made a remark that “the upper classes use milk instead of water in the dough, as they wish to have a whiter and better bread, which thereby acquires a more pleasant taste”. He compared his own knowledge on the food habits of Swedish society with those mentioned in classical literature. Thus, according to Linné, Juvenal wrote that “a soft and snow-white bread of the finest wheat is given to the master”, while Galen condemned oat bread as suitable only for cattle, not for humans. Here Linné had to admit that it is, however, consumed in certain provinces in Sweden.
Linné was aware of and discussed the consequences of consuming less tasty and less satisfying bread, but he seems to have accepted as a fact that people belonging to different social classes should use different foods to satisfy their hunger. For example, he commented that “bran is more difficult to digest than flour, except for hard-labouring peasants and the likes, who are scarcely troubled by it”. The necessity of having to eat filling but less palatable bread was inevitable, but could be even positive from the nutritional point of view. “In Östergötland they mix the grain with flour made from peas and in Scania with vetch, so that the bread may be more nutritious for the hard-working peasants, but at the same time it becomes less flavoursome, drier and less pleasing to the palate.” And, “Soft bread is used mainly by the aristocracy and the rich, but it weakens the gums and teeth, which get too little exercise in chewing. However, the peasant folk who eat hard bread cakes generally have stronger teeth and firmer gums”.
It is intriguing that Linné did not find it necessary to discuss the consumption or effect on health of other bakery products, such as the sweet cakes, tarts, pies and biscuits served by the fashion-conscious upper class and the most prosperous bourgeois. Several cookery books with recipes for the fashionable pastry products were published in Sweden in the eighteenth century [14]. The most famous of these, Hjelpreda i Hushållningen för Unga Fruentimmer by Kajsa Warg, published in 1755, included many recipes for sweet pastries [15]. Linné mentioned only in passing that the addition of egg makes the bread moist and crumbly, and sugar and currants impart a good flavour.
The sweet and decorated pastries were usually consumed with wine or with the new exotic beverages, tea and coffee. It is probable that Linné regarded pastries as unnecessary luxuries, since expensive imported ingredients, sugar and spices, were indispensable in their preparation. […]
Linné emphasized that soft and fresh bread does not draw in as much saliva and thus remains undigested for a long time, “like a stone in the stomach”. He strongly warned against eating warm bread with butter. While it was “considered as a delicacy, there was scarcely another food that was more damaging for the stomach and teeth, for they were loosen’d by it and fell out”. By way of illustration he told an example reported by a doctor who lived in a town near Amsterdam. Most of the inhabitants of this town were bakers, who sold bread daily to the residents of Amsterdam and had the practice of attracting customers with oven-warm bread, sliced and spread with butter. According to Linné, this particular doctor was not surprised when most of the residents of this town “suffered from bad stomach, poor digestion, flatulence, hysterical afflictions and 600 other problems”. […]
Linné was not the first in Sweden to write about famine bread. Among his remaining papers in London there are copies from two official documents from 1696 concerning the crop failure in the northern parts of Sweden and the possibility of preparing flour from different roots, and an anonymous small paper which contained descriptions of 21 plants, the roots or leaves of which could be used for flour [10]. These texts had obviously been studied by Linné with interest.
When writing about substitute breads, Linné formulated his aim as the following: “It will teach the poor peasant to bake bread with little or no grain in the circumstance of crop failure without destroying the body and health with unnatural foods, as often happens in the countryside in years of hardship” [10].
Linné’s idea for a publication on bread substitutes probably originated during his early journeys to Lapland and Dalarna, where grain substitutes were a necessity even in good years. Actually, bark bread was eaten in northern Sweden until the late nineteenth century [4]. In the poorest regions of eastern and north-eastern Finland it was still consumed in the 1920s [26]. […]
Bark bread has been used in the subarctic area since prehistoric times [4]. According to Linné, no other bread was such a common famine bread. He described how in springtime the soft inner layer can be removed from debarked pine trees, cleaned of any remaining bark, roasted or soaked to remove the resin, and dried and ground into flour. Linné had obviously eaten bark bread, since he could say that “it tastes rather well, is however more bitter than other bread”. His view of bark bread was most positive but perhaps unrealistic: “People not only sustain themselves on this, but also often become corpulent of it, indeed long for it.” Linné’s high regard for bark bread was shared by many of his contemporaries, but not all. For example, Pehr Adrian Gadd, the first professor of chemistry in Turku (Åbo) Academy and one of the most prominent utilitarians in Finland, condemned bark bread as “useless, if not harmful to use” [28]. In Sweden, Anders Johan Retzius, a professor in Lund and an expert on the economic and pharmacological potential of Swedish flora, called bark bread “a paltry food, with which they can hardly survive and of which they always after some time get a swollen body, pale and bluish skin, big and hard stomach, constipation and finally dropsy, which ends the misery” [4]. […]
Linné’s investigations of substitutes for grain became of practical service when a failed harvest of the previous summer was followed by famine in 1757 [10]. Linné sent a memorandum to King Adolf Fredrik in the spring of 1757 and pointed out the risk to the health of the hungry people when they ignorantly chose unsuitable plants as a substitute for grain. He included a short paper on the indigenous plants which in the shortage of grain could be used in bread-making and other cooking. His Majesty immediately permitted this leaflet to be printed at public expense and distributed throughout the country [10]. Soon Linné’s recipes using wild flora were read out in churches across Sweden. In Berättelse om The inhemska wäxter, som i brist af Säd kunna anwändas til Bröd- och Matredning, Linné [32] described the habitats and the popular names of about 30 edible wild plants, eight of which were recommended for bread-making.
* * *
Let me make an argument about (hyper-)individualism, rigid egoic boundaries, and hence Jaynesian consciousness (about Julian Jaynes, see other posts). But I’ll come at it from a less typical angle. I’ve been reading much about diet, nutrition, and health. With agriculture, the entire environment in which humans lived was fundamentally transformed, such as the rise of inequality and hierarchy, concentrated wealth and centralized power; not to mention the increase of parasites and diseases from urbanization and close cohabitation with farm animals (The World Around Us). We might be able to thank early agricultural societies, as an example, for introducing malaria to the world.
Maybe more importantly, there are significant links between what we eat and so much else: gut health, hormonal regulation, immune system, and neurocognitive functioning. There are multiple pathways, one of which is direct, connecting the gut and the brain: nervous system, immune system, hormonal system, etc — for the effect of diet and nutrition on immune response, including leaky gut, consider the lymphatic-brain link (Neuroscience News, Researchers Find Missing Link Between the Brain and Immune System) with the immune system as what some refer to as the “mobile mind” (Susan L. Prescott & Alan C. Logan, The Secret Life of Your Microbiome, pp. 64-7, pp. 249-50). As for a direct and near-instantaneous gut-brain link, there was a recent discovery of the involvement of the vagus nerve, a possible explanation for the ‘gut sense’, with the key neurotransmitter glutamate modulating the rate of transmission in synaptic communication between enteroendocrine cells and vagal nerve neurons (Rich Haridy, Fast and hardwired: Gut-brain connection could lead to a “new sense”), and this is implicated in “episodic and spatial working memory” that might assist in the relocation of food sources (Rich Haridy, Researchers reveal how disrupting gut-brain communication may affect learning and memory). The gut is sometimes called the second brain because it also has neuronal cells, but in evolutionary terms it is the first brain. To demonstrate one example of a connection, many are beginning to refer to Alzheimer’s as type 3 diabetes, and dietary interventions have reversed symptoms in clinical studies. Also, gut microbes and parasites have been shown to influence our neurocognition and psychology, even altering personality traits and behavior, as with Toxoplasma gondii. [For more discussion, see Fasting, Calorie Restriction, and Ketosis.]
The gut-brain link explains why glutamate as a food additive might be so problematic for so many people. Much of the research has looked at other health areas, such as metabolism or liver functioning. It would make more sense to look at its effect on neurocognition, but as with many other substances, many scientists have dismissed the possibility of glutamate passing the blood-brain barrier. Yet we now know that many things once thought to be kept out of the brain do, under some conditions, get into the brain. After all, the same mechanisms that cause leaky gut (e.g., inflammation) can also cause permeability in the brain. So we know a mechanism by which this could happen. Evidence is pointing in this direction: “MSG acts on the glutamate receptors and releases neurotransmitters which play a vital role in normal physiological as well as pathological processes (Abdallah et al., 2014[1]). Glutamate receptors have three groups of metabotropic receptors (mGluR) and four classes of ionotropic receptors (NMDA, AMPA, delta and kainite receptors). All of these receptor types are present across the central nervous system. They are especially numerous in the hypothalamus, hippocampus and amygdala, where they control autonomic and metabolic activities (Zhu and Gouaux, 2017[22]). Results from both animal and human studies have demonstrated that administration of even the lowest dose of MSG has toxic effects. The average intake of MSG per day is estimated to be 0.3-1.0 g (Solomon et al., 2015[18]). These doses potentially disrupt neurons and might have adverse effects on behaviour” (Kamal Niaz, Extensive use of monosodium glutamate: A threat to public health?).
One possibility to consider is the role of exorphins, which are addictive and can be blocked in the same way as opioids. Exorphin, in fact, means external morphine-like substance, just as endorphin means endogenous, internally produced morphine-like substance. Exorphins are found in milk and wheat. Milk, in particular, stands out. Even though exorphins are found in other foods, it’s been argued that they are insignificant because they theoretically can’t pass through the gut barrier, much less the blood-brain barrier. Yet exorphins have been measured elsewhere in the human body. One explanation is gut permeability (related to permeability throughout the body), which can be caused by many factors such as stress but also by milk itself. The purpose of milk is to get nutrients into the calf, and this is done by loosening the junctions of the gut lining to let more nutrients through the protective barrier. Exorphins get in as well and create a pleasurable experience to motivate the calf to drink more. Along with exorphins, grains and dairy also contain dopaminergic peptides, and dopamine is the other major addictive substance. It feels good to consume dairy, as with wheat, whether you’re a calf or a human, and so one wants more. Think about that the next time you pour milk over cereal.
Addiction, of food or drugs or anything else, is a powerful force. And it is complex in what it affects, not only physiologically and psychologically but also on a social level. Johann Hari offers a great analysis in Chasing the Scream. He makes the case that addiction is largely about isolation and that the addict is the ultimate individual (see To Put the Rat Back in the Rat Park, Rationalizing the Rat Race, Imagining the Rat Park, & Individualism and Isolation), and by the way this connects to Jaynesian consciousness with its rigid egoic boundaries as opposed to the bundled and porous mind, the extended and enmeshed self of bicameralism and animism. It stands out to me that addiction and addictive substances have increased over the course of civilization, and I’ve argued that this is about a totalizing cultural system and a fully encompassing ideological worldview, what some call a reality tunnel (see discussion of addiction and social control in Diets and Systems & Western Individuality Before the Enlightenment Age). The growing of poppies, sugar cane, etc. came later in civilization, as did the production of beer and wine — by the way, alcohol releases endorphins, sugar causes a serotonin high, and both activate the hedonic pathway. Also, grain and dairy were slow to catch on as a large part of the diet. Until recent centuries, most populations remained dependent on animal foods, including wild game (I discuss this era of dietary transition and societal transformation in numerous posts, with industrialization and technology pushing the already stressed agricultural mind to an extreme: Ancient Atherosclerosis?, To Be Fat And Have Bread, Autism and the Upper Crust, “Yes, tea banished the fairies.”, Voice and Perspective, Hubris of Nutritionism, Health From Generation To Generation, Dietary Health Across Generations, Moral Panic and Physical Degeneration, The Crisis of Identity, The Disease of Nostalgia, & Technological Fears and Media Panics). Americans, for example, ate large amounts of meat, butter, and lard from the colonial era through the 19th century (see Nina Teicholz, The Big Fat Surprise; passage quoted in full at Malnourished Americans). In 1900, Americans on average were getting only 10% of their caloric intake from carbohydrates, and sugar intake was minimal: a potentially ketogenic diet, considering how much lower in calories the average diet was back then.
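As a rough sanity check on that last claim, here’s the arithmetic. The 10% figure is from above; the daily calorie totals and the roughly 50 grams-per-day ketosis threshold are common rules of thumb, used here as assumptions:

```python
# Rough check: what does a diet with 10% of calories from carbohydrate
# look like in grams per day? Carbohydrate provides ~4 kcal per gram.
# The daily calorie totals and the ~50 g/day threshold often cited for
# nutritional ketosis are rules of thumb used here as assumptions.

KCAL_PER_GRAM_CARB = 4
CARB_SHARE = 0.10  # from the text: ~10% of calories from carbs circa 1900

for daily_kcal in (1800, 2000, 2500):
    carb_grams = daily_kcal * CARB_SHARE / KCAL_PER_GRAM_CARB
    print(f"{daily_kcal} kcal/day -> {carb_grams:.0f} g carbohydrate/day")
```

At around 2,000 calories a day, a 10% carbohydrate share works out to roughly 50 grams, near the level commonly cited as compatible with nutritional ketosis, which is the point of the parenthetical above.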
Something else to consider is that low-carb diets can alter how the body and brain function (the word ‘alter’ is inaccurate, though, since in evolutionary terms ketosis would’ve been the normal state; rather, the modern high-carb diet is altered from the biological norm). That is even more true if combined with intermittent fasting and restricted eating times that would have been more common in the past (Past Views On One Meal A Day (OMAD)). Interestingly, this only applies to adults, since we know that babies remain in ketosis during breastfeeding, there is evidence that they are already in ketosis in utero, and well into the teen years humans apparently remain in ketosis: “It is fascinating to see that every single child, so far through age 16, is in ketosis even after a breakfast containing fruits and milk” (Angela A. Stanton, Children in Ketosis: The Feared Fuel). “I have yet to see a blood ketone test of a child anywhere in this age group that is not showing ketosis both before and after a meal” (Angela A. Stanton, If Ketosis Is Only a Fad, Why Are Our Kids in Ketosis?). Ketosis is not only safe but necessary for humans (“Is keto safe for kids?”). Taken together, earlier humans would have spent more time in ketosis (fat-burning mode, as opposed to glucose-burning), which dramatically affects human biology. The further one goes back in history, the more time people probably spent in ketosis. One difference with ketosis is that, for many people, cravings and food addictions disappear. [For more discussion of this topic, see previous posts: Fasting, Calorie Restriction, and Ketosis, Ketogenic Diet and Neurocognitive Health, Is Ketosis Normal?, & “Is keto safe for kids?”.] Ketosis is a non-addictive or maybe even anti-addictive state of mind (Francisco Ródenas-González et al., Effects of ketosis on cocaine-induced reinstatement in male mice), similar to how certain psychedelics can be used to break addiction — one might argue there is a historical connection over the millennia between a decrease of psychedelic use and an increase of addictive substances: sugar, caffeine, nicotine, opium, etc (Diets and Systems, “Yes, tea banished the fairies.”, & Wealth, Power, and Addiction). Many hunter-gatherer tribes can go days without eating and it doesn’t appear to bother them, such as in Daniel Everett’s account of the Pirahã, and that is typical of ketosis — fasting forces one into ketosis, if one isn’t already in ketosis, and so beginning a fast in ketosis makes it even easier. This was also observed of Mongol warriors who could ride and fight for days on end without tiring or needing to stop for food. What is also different about hunter-gatherers and similar traditional societies is how communal they are or were and how much more expansive their identities are in belonging to a group, the opposite of the addictive egoic mind of high-carb agricultural societies. Anthropological research shows how hunter-gatherers often have a sense of personal space that extends into the environment around them. What if that isn’t merely cultural but something to do with how their bodies and brains operate? Maybe diet even plays a role. Hold that thought for a moment.
Now go back to the two staples of the modern diet, grains and dairy. Besides exorphins and dopaminergic substances, they also have high levels of glutamate, as part of gluten and casein respectively. Dr. Katherine Reid is a biochemist whose daughter was diagnosed with severe autism. She went into research mode and experimented with supplementation and then diet. Many things seemed to help, but the greatest result came from restriction of dietary glutamate, a difficult challenge as it is a common food additive (see her TED talk here and another talk here or, for a short and informal video, look here). This requires going on a largely whole-foods diet, that is to say eliminating processed foods (also see the Traditional Foods diet of Weston A. Price and Sally Fallon Morell, along with the GAPS diet of Natasha Campbell-McBride). But when dealing with a serious issue, it is worth the effort. Dr. Reid’s daughter showed immense improvement, to such a degree that she was kicked out of the special needs school. After being on this diet for a while, she socialized and communicated normally like any other child, something she was previously incapable of. Keep in mind that glutamate, as mentioned above, is necessary as a foundational neurotransmitter in modulating communication between the gut and brain. But typically we only get small amounts of it, as opposed to the large doses found in the modern diet. In response to the TED Talk given by Reid, Georgia Ede commented that it’s “Unclear if glutamate is main culprit, b/c a) little glutamate crosses blood-brain barrier; b) anything that triggers inflammation/oxidation (i.e. refined carbs) spikes brain glutamate production.” Either way, glutamate plays a powerful role in brain functioning. And no matter the exact line of causation, industrially processed foods in the modern diet would be involved. By the way, an exacerbating factor might be mercury in its relation to anxiety and adrenal fatigue, as it ramps up the fight-or-flight system via over-sensitizing the glutamate pathway — could this be involved in conditions like autism where emotional sensitivity is a symptom? Mercury and glutamate simultaneously increasing in the modern world demonstrates how industrialization can push the effects of the agricultural diet to ever further extremes.
Glutamate is also implicated in schizophrenia: “The most intriguing evidence came when the researchers gave germ-free mice fecal transplants from the schizophrenic patients. They found that “the mice behaved in a way that is reminiscent of the behavior of people with schizophrenia,” said Julio Licinio, who co-led the new work with Wong, his research partner and spouse. Mice given fecal transplants from healthy controls behaved normally. “The brains of the animals given microbes from patients with schizophrenia also showed changes in glutamate, a neurotransmitter that is thought to be dysregulated in schizophrenia,” he added. The discovery shows how altering the gut can influence an animal’s behavior” (Roni Dengler, Researchers Find Further Evidence That Schizophrenia is Connected to Our Guts; reporting on Peng Zheng et al, The gut microbiome from patients with schizophrenia modulates the glutamate-glutamine-GABA cycle and schizophrenia-relevant behaviors in mice, Science Advances journal). And glutamate is involved in other conditions as well, such as in relation to GABA: “But how do microbes in the gut affect [epileptic] seizures that occur in the brain? Researchers found that the microbe-mediated effects of the Ketogenic Diet decreased levels of enzymes required to produce the excitatory neurotransmitter glutamate. In turn, this increased the relative abundance of the inhibitory neurotransmitter GABA. Taken together, these results show that the microbe-mediated effects of the Ketogenic Diet have a direct effect on neural activity, further strengthening support for the emerging concept of the ‘gut-brain’ axis.” (Jason Bush, Important Ketogenic Diet Benefit is Dependent on the Gut Microbiome). Glutamate is one neurotransmitter among many that can be affected in a similar manner; e.g., serotonin is also produced in the gut.
That reminds me of propionate, a short-chain fatty acid and the conjugate base of propionic acid. It is another substance normally taken in at a low level. Certain foods, including grains and dairy, contain it. The problem is that, as a useful preservative, it has been generously added to the food supply. Research on rodents shows injecting them with propionate causes autistic-like behaviors. And other rodent studies show how this stunts learning ability and causes repetitive behavior (both related to the autistic demand for the familiar), as too much propionate entrenches mental patterns through the mechanism that gut microbes use to communicate to the brain how to return to a needed food source, similar to the related function of glutamate. A recent study shows that propionate not only alters brain functioning but also brain development (L.S. Abdelli et al, Propionic Acid Induces Gliosis and Neuro-inflammation through Modulation of PTEN/AKT Pathway in Autism Spectrum Disorder), and this is a growing field of research (e.g., Hyosun Choi, Propionic acid induces dendritic spine loss by MAPK/ERK signaling and dysregulation of autophagic flux). As reported by Suhtling Wong-Vienneau at the University of Central Florida, “when fetal-derived neural stem cells are exposed to high levels of Propionic Acid (PPA), an additive commonly found in processed foods, it decreases neuron development” (Processed Foods May Hold Key to Rise in Autism). This study “is the first to discover the molecular link between elevated levels of PPA, proliferation of glial cells, disturbed neural circuitry and autism.”
The impact is profound and permanent — Pedersen offers the details: “In the lab, the scientists discovered that exposing neural stem cells to excessive PPA damages brain cells in several ways: First, the acid disrupts the natural balance between brain cells by reducing the number of neurons and over-producing glial cells. And although glial cells help develop and protect neuron function, too many glia cells disturb connectivity between neurons. They also cause inflammation, which has been noted in the brains of autistic children. In addition, excessive amounts of the acid shorten and damage pathways that neurons use to communicate with the rest of the body. This combination of reduced neurons and damaged pathways hinder the brain’s ability to communicate, resulting in behaviors that are often found in children with autism, including repetitive behavior, mobility issues and inability to interact with others.” According to this study, “too much PPA also damaged the molecular pathways that normally enable neurons to send information to the rest of the body. The researchers suggest that such disruption in the brain’s ability to communicate may explain ASD-related characteristics such as repetitive behavior and difficulties with social interaction” (Ana Sandoiu, Could processed foods explain why autism is on the rise?).
So, the autistic brain develops under higher levels of propionate and maybe becomes accustomed to them. A state of dysfunction becomes what feels normal. Propionate causes inflammation and, as Dr. Ede points out, “anything that triggers inflammation/oxidation (i.e. refined carbs) spikes brain glutamate production”. High levels of propionate and glutamate become part of the state of mind the autistic identifies with. It all links together. Autistics, along with craving foods that contain propionate (and glutamate), tend to have larger populations of a particular gut microbe that produces propionate. This might be why antibiotics, which kill gut microbes, can help with autism. But in the case of depression, gut issues are associated instead with the lack of certain microbes that produce butyrate, another important substance that is also found in certain foods (Mireia Valles-Colomer et al, The neuroactive potential of the human gut microbiota in quality of life and depression). Depending on the specific gut dysbiosis, diverse neurocognitive conditions can result. And in affecting the microbiome, changes in autism can be achieved through a ketogenic diet, which temporarily reduces the microbiome (similar to an antibiotic) — this presumably takes care of the problematic microbes and readjusts the gut from dysbiosis to a healthier balance. Also, ketosis would reduce the inflammation that is associated with glutamate production.
As with propionate, exorphins injected into rats will likewise elicit autistic-like behaviors. By two different pathways, the body derives exorphins and propionate from the consumption of grains and dairy: the former from the breakdown of proteins, and the latter produced by gut bacteria in the breakdown of some grains and refined carbohydrates (on top of the propionate used as a food additive; also, at least in rodents, artificial sweeteners increase propionate levels). [For related points and further discussion, see the section below about vitamin B1 (thiamine/thiamin). Also covered are other B vitamins and nutrients.] This is part of the explanation for why many autistics have responded well to ketosis from carbohydrate restriction, specifically paleo diets that eliminate both wheat and dairy. Ketones themselves play a role as well: they use the same transporters as propionate and so block its buildup in cells, and, of course, ketones offer a different energy source for cells as a replacement for glucose, which alters how cells function, specifically neurocognitive functioning and its attendant psychological effects.
There are some other factors to consider as well. With agriculture came a diet high in starchy carbohydrates and sugar. This inevitably leads to increased metabolic syndrome, including diabetes. And diabetes in pregnant women is associated with autism and attention deficit disorder in children. “Maternal diabetes, if not well treated, which means hyperglycemia in utero, that increases uterine inflammation, oxidative stress and hypoxia and may alter gene expression,” explained Anny H. Xiang. “This can disrupt fetal brain development, increasing the risk for neural behavior disorders, such as autism” (Maternal HbA1c influences autism risk in offspring); by the way, other factors such as getting more seed oils and fewer B vitamins also contribute to metabolic syndrome and altered gene expression, including being inherited epigenetically, not to mention mutagenic changes to the genes themselves (Catherine Shanahan, Deep Nutrition). The increase of diabetes, not merely of diagnosis, could partly explain the greater prevalence of autism over time. Grain surpluses only became available in the 1800s, around the time when refined flour and sugar began to become common. It wasn’t until the following century that carbohydrates finally overtook animal foods as the mainstay of the diet, specifically in terms of what is most regularly eaten throughout the day in both meals and snacks — a constant influx of glucose into the system.
A further contributing factor in modern agriculture is pesticides, also associated with autism. Consider DDE, a breakdown product of DDT, which has been banned for decades but apparently still lingers in the environment. “The odds of autism among children were increased, by 32 percent, in mothers whose DDE levels were high (high was, comparatively, 75th percentile or greater),” one study found (Aditi Vyas & Richa Kalra, Long lingering pesticides may increase risk for autism: Study). “Researchers also found,” the article reports, “that the odds of having children on the autism spectrum who also had an intellectual disability were increased more than two-fold when the mother’s DDE levels were high.” A different study showed a broader effect in terms of 11 pesticides still in use:
“They found a 10 percent or more increase in rates of autism spectrum disorder, or ASD, in children whose mothers lived during pregnancy within about a mile and a quarter of a highly sprayed area. The rates varied depending on the specific pesticide sprayed, and glyphosate was associated with a 16 percent increase. Rates of autism spectrum disorders combined with intellectual disability increased by even more, about 30 percent. Exposure after birth, in the first year of life, showed the most dramatic impact, with rates of ASD with intellectual disability increasing by 50 percent on average for children who lived within the mile-and-a-quarter range. Those who lived near glyphosate spraying showed the most increased risk, at 60 percent” (Nicole Ferox, It’s Personal: Pesticide Exposures Come at a Cost).
An additional component to consider is plant anti-nutrients. For example, oxalates may be involved in autism spectrum disorder (Jerzy Konstantynowicz et al, A potential pathogenic role of oxalate in autism). With the end of the Ice Age, vegetation became more common and some of the animal foods less common. That increased plant foods as part of the human diet. But even then their role was limited and seasonal. The dying off of the megafauna was a greater blow, as it forced humans to rely both on less desirable lean meats from smaller prey and on more plant foods. And of course, the agricultural revolution followed shortly after that with its devastating effects. None of these changes were kind to human health and development, as the evidence shows in the human bones and mummies left behind. Yet they were minor compared to what was to come. The increase of plant foods was a slow process over millennia. All the way up to the 19th century, Americans were eating severely restricted amounts of plant foods and instead depending on fatty animal foods, from pasture-raised butter and lard to wild-caught fish and deer — the abundance of wilderness and pasturage made such foods widely available, convenient, and cheap, besides being delicious and nutritious. Grain crops and vegetable gardens were simply too hard to grow, as described by Nina Teicholz in The Big Fat Surprise (see quoted passage at Malnourished Americans).
At Walden Pond, Henry David Thoreau maintained a garden of beans, peas, corn, turnips, and potatoes, and ate a largely plant-based diet (Jennie Richards, Henry David Thoreau Advocated “Leaving Off Eating Animals”). That diet surely contributed to his declining health from tuberculosis by weakening his immune system through deficiency in the fat-soluble vitamins, although his nearby mother occasionally made him a fruit pie that would’ve had nutritious lard in the crust. One analysis points to the “lack of quality protein and excess of carbohydrate foods in Thoreau’s diet as probable causes behind his infection” (Dr. Benjamin P. Sandler, Thoreau, Pulmonary Tuberculosis and Dietary Deficiency). Likewise, Franz Kafka, who became a vegetarian, also died from tuberculosis (Old Debates Forgotten). Weston A. Price observed the link between deficiency of fat-soluble vitamins and high rates of tuberculosis, not that one causes the other but that a nutritious diet is key to a strong immune system (Dr. Kendrick On Vaccines & Moral Panic and Physical Degeneration). Besides, eliminating fatty animal foods typically means increasing starchy and sugary plant foods, which lessens the anti-inflammatory response from ketosis and autophagy and hence the capacity for healing.
The connection of physical health to mental health, another insight of Price, should be re-emphasized. Interestingly, Kafka suffered from psychological, presumably neurocognitive, issues long before tubercular symptoms showed up, and he came to see the link between them as causal, although he saw it the other way around, as psychosomatic. Even more intriguing, Kafka suggests that, as Sander L. Gilman put it, “all urban dwellers are tubercular,” as if it were a nervous condition of modern civilization akin to what used to be called neurasthenia (about Kafka’s case, see Sander L. Gilman’s Franz Kafka, the Jewish Patient). He even uses the popular economic model of energy and health: “For secretly I don’t believe this illness to be tuberculosis, at least primarily tuberculosis, but rather a sign of general bankruptcy” (for context, see The Crisis of Identity). Speaking of the eugenic, hygienic, sociological, and aesthetic, Gilman further notes that, “For Kafka, that possibility is linked to the notion that illness and creativity are linked, that tuberculars are also creative geniuses,” indicating an interpretation of neurasthenia among the intellectual class, an interpretation that was more common in the United States than in Europe.
The upper classes were deemed the most civilized and so it was expected that they’d suffer the most from the diseases of civilization, and indeed the upper classes fully adopted the modern industrial diet before the rest of the population. In contrast, while staying at a sanatorium (a combination of the rest cure and the west cure), Kafka stated that, “I am firmly convinced, now that I have been living here among consumptives, that healthy people run no danger of infection. Here, however, the healthy are only the woodcutters in the forest and the girls in the kitchen (who will simply pick uneaten food from the plates of patients and eat it—patients whom I shrink from sitting opposite) but not a single person from our town circles,” from a letter to Max Brod on March 11, 1921. It should be pointed out that tuberculosis sanatoriums were typically located in rural mountain areas where local populations were known to be healthy, the kinds of communities Weston A. Price studied in the 1930s. A similar reason explains why, in America, tuberculosis patients were sometimes sent west (the west cure) for clean air and a healthy lifestyle, probably with an accompanying change toward a rural diet, with more wild-caught animal foods higher in omega-3s and lower in omega-6s, not to mention higher in fat-soluble vitamins.
The historical context of public health overlapped with racial hygiene, and indeed some of Kafka’s family members and lovers would later die at the hands of Nazis. Eugenicists were obsessed with body types in relation to supposed racial features, but non-eugenicists also accepted that physical structure was useful information to be considered; and this insight, if not the eugenicist ideology, is supported by the more recent scientific measurements of stunted bone development in the early agricultural societies. Hermann Brehmer, a founder of the sanatorium movement, asserted that a particular body type (habitus phthisicus, equivalent to habitus asthenicus) was associated with tuberculosis, the kind of thinking that Weston A. Price would pick up in his observations of physical development, although Price saw the explanation as dietary and not racial. The other difference is that Price saw “body type” not as a cause but as a symptom of ill health, and so the focus on re-forming the body (through lung exercises, orthopedic corsets, etc.) to improve health was not the most helpful advice. On the other hand, if re-forming the body involved something like the west cure in changing the entire lifestyle and environmental conditions, it might work by way of changing other factors of health; along with diet, exercise and sunshine and clean air and water would definitely improve immune function, lower inflammation, and much else (sanatoriums prioritized such things as getting plenty of sunshine and dairy, both of which would increase the vitamin D3 that is necessary for immunological health). Improvements in physical health, of course, would go hand in hand with those of mental health. An example of this is that winter conceptions, when vitamin D3 production is low, result in higher rates later on of childhood learning disabilities and other problems in neurocognitive development (BBC, Learning difficulties linked with winter conception).
As a side note, physical development was tied up with gender issues and gender roles, especially for boys in becoming men. A fear arose that the newer generations of urban youth were failing to develop properly, physically and mentally, morally and socially. Fitness became a central concern for the civilizational project and it was feared that we modern humans might fail this challenge. Most galling of all was ‘feminization’, not only about loss of an athletic build but loss of something in the masculine psychology, involving the depression and anxiety, sensitivity and weakness of conditions like neurasthenia, while also overlapping with tubercular consumption. Some of this could be projected onto racial inferiority, far from being limited to the distinction between those of European descent and all others, for it also was used to divide humanity up in numerous ways (German vs French, English vs Irish, North vs South, rich vs poor, Protestants vs Catholics, Christians vs Jews, etc).
Gender norms were applied to all aspects of health and development, including perceived moral character and personality disposition. This is a danger to the individual, but also potentially a danger to society. “Here we can return for the moment to the notion that the male Jew is feminized like the male tubercular. The tubercular’s progressive feminization begins in the middle of the nineteenth century with the introduction of the term: infemminire, to feminize, which is supposedly a result of male castration. By the 1870s, the term is used to describe the feminisme of the male through the effects of other disease, such as tuberculosis. Henry Meige, at the Salpetriere, saw this feminization as an atavism, in which the male returns to the level of the “sexless” child. Feminization is therefore a loss, which can cause masturbation and thus illness in certain predisposed individuals. It is also the result of actual castration or its physiological equivalent, such as an intensely debilitating illness like tuberculosis, which reshapes the body” (Sander L. Gilman, Franz Kafka, the Jewish Patient). There was a fear that all of civilization was becoming effeminate, especially among the upper classes who were expected to be the leaders. That was the entire framework of neurasthenia-obsessed rhetoric in late nineteenth to early twentieth century America. The newer generations of boys, the argument went, were somehow deficient and inadequate. Looking back on that period, there is no doubt that physical and mental illness was increasing, while bone structure was becoming underdeveloped in a way one could perceive as effeminate; such bone development problems are particularly obvious among children raised on plant-based diets, especially veganism and near-vegan vegetarianism, but also anyone on a diet lacking nutritious animal foods.
Let me make one odd connection before moving on. The Seventh Day Adventist Dr. John Harvey Kellogg believed masturbation was both a moral sin and a cause of ill health, as well as a sign of inferiority, and his advocacy of a high-fiber vegan diet including breakfast cereals was based on the Galenic theory that such foods decreased libido. Dr. Kellogg was also an influential eugenicist and operated a famous sanatorium. He wasn’t alone in blaming masturbation for disease. The British Dr. D. G. Macleod Munro treated masturbation as a contributing factor for tuberculosis: “the advent of the sexual appetite in normal adolescence has a profound effect upon the organism, and in many cases when uncontrolled, leads to excess about the age when tuberculosis most frequently delivers its first open assault upon the body,” as quoted by Gilman. This relates to the ‘bankruptcy’ Kafka mentioned, the idea that one could waste one’s energy reserves. Maybe there is an insight in this belief, despite it being misguided and misinterpreted. The source of the ‘bankruptcy’ may have in part been a nutritional debt, and certainly a high-fiber vegan diet would not refill one’s energy and nutrient reserves as an investment in one’s health — hence, the public health risk of what one might call a hyper-agricultural diet as exemplified by the USDA dietary recommendations and corporate-backed dietary campaigns like EAT-Lancet (Dietary Dictocrats of EAT-Lancet; & Corporate Veganism), though the course is maybe finally reversing (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines; American Diabetes Association Changes Its Tune; & Corporate Media Slowly Catching Up With Nutritional Studies).
So far, my focus has mostly been on what we ingest or are otherwise exposed to because of agriculture and the food system, in general and more specifically in industrialized society with its refined, processed, and adulterated foods, largely from plants. But the other side of the picture is what our diet is lacking, what we are deficient in. As I touched upon directly above, an agricultural diet hasn’t only increased certain foods and substances but simultaneously decreased others. What promoted optimal health throughout human evolution has, in many cases, been displaced or interrupted. Agriculture is highly destructive and has depleted nutrient levels in the soil (Carnivore Is Vegan); and, along with this, even animal foods produced within the agricultural system are similarly depleted of nutrients as compared to animal foods from pasture or free-range. For example, the fat-soluble vitamins (true vitamin A as retinol, vitamin D3, vitamin K2 not to be confused with K1, and the vitamin E complex) are not found in plant foods and are found in far lower concentrations in foods from animals that are factory-farmed or that graze on soil impoverished by agriculture, especially under the threat of erosion and desertification. Rhonda Patrick points to deficiencies of vitamin D3, EPA, and DHA, and hence insufficient serotonin levels, as being causally linked to autism, ADHD, bipolar disorder, schizophrenia, etc (TheIHMC, Rhonda Patrick on Diet-Gene Interactions, Epigenetics, the Vitamin D-Serotonin Link and DNA Damage). She also discusses inflammation, epigenetics, and DNA damage, which relates to the work of others (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations).
One of the biggest changes with agriculture was the decrease of fatty animal foods that were nutrient-dense and nutrient-bioavailable. It’s in the fat that the fat-soluble vitamins are found, and fat is necessary for their absorption (i.e., fat-soluble); and these key nutrients relate to almost everything else, such as the minerals calcium and magnesium that also are found in animal foods (Calcium: Nutrient Combination and Ratios); the relationship of seafood to the balance of sodium, magnesium, and potassium is central (On Salt: Sodium, Trace Minerals, and Electrolytes), and indeed populations that eat more seafood live longer. These animal foods used to hold the prized position in the human diet, and in the earlier hominid diet as well, as part of our evolutionary inheritance from millions of years of adaptation to a world where fatty animals once were abundant (J. Tyler Faith, John Rowan & Andrew Du, Early hominins evolved within non-analog ecosystems). That was definitely true in the paleolithic before the megafauna die-off, but even to this day hunter-gatherers, when they have access to traditional territory and prey, will seek out the fattest animals available, entirely ignoring lean animals because rabbit sickness is worse than hunger (humans can always fast for many days or weeks, if necessary, and as long as they have reserves of body fat they can remain perfectly healthy).
We’ve already discussed autism in terms of many other dietary factors, especially imbalances of otherwise essential substances like glutamate, propionate, and butyrate. But like most modern people, those on the autistic spectrum can be nutritionally deficient in other ways, and unsurprisingly that would involve fat-soluble vitamins. In a fascinating discussion in one of her more recent books, Nourishing Fats, Sally Fallon Morell offers a hypothesis of an indirect causal mechanism. First off, she notes that, “Dr. Mary Megson of Richmond, Virginia, had noticed that night blindness and thyroid conditions—both signs of vitamin A deficiency—were common in family members of autistic children” (p. 156), indicating a probable deficiency of the same in the affected child. This might be why supplementing cod liver oil, high in true vitamin A, helps with autistic issues. “As Dr. Megson explains, in genetically predisposed children, autism is linked to a G-alpha protein defect. G-alpha proteins form one of the most prevalent signaling systems in our cells, regulating processes as diverse as cell growth, hormonal regulation and sensory perception—like seeing” (p. 157).
The sensory issues common among autistics may seem to be neurocognitive in origin, but the perceptual and psychological effects may be secondary to the real cause in altered eye development. Because the rods in their eyes don’t function properly, they have distorted vision that is experienced as a blurry and divided visual field, like a magic-eye puzzle, and it takes constant effort to make coherent sense of the world around them. “According to Megson, the blocked visual pathways explain why children on the autism spectrum “melt down” when objects are moved or when you clean up their lines or piles of toys sorted by color. They work hard to piece together their world; it frightens and overwhelms them when the world as they are able to see it changes. It also might explain why children on the autism spectrum spend time organizing things so carefully. It’s the only way they can “see” what’s out there” (p. 157). The rods at the edge of their vision work better, and so they prefer to not look directly at people.
The vitamin A link is not merely speculative. Studies have sussed out some of the proven and possible factors and mechanisms in other aspects of autism: “Decreased vitamin A, and its retinoic acid metabolites, lead to a decrease in CD38 and associated changes that underpin a wide array of data on the biological underpinnings of ASD, including decreased oxytocin, with relevance both prenatally and in the gut. Decreased sirtuins, poly-ADP ribose polymerase-driven decreases in nicotinamide adenine dinucleotide (NAD+), hyperserotonemia, decreased monoamine oxidase, alterations in 14-3-3 proteins, microRNA alterations, dysregulated aryl hydrocarbon receptor activity, suboptimal mitochondria functioning, and decreases in the melatonergic pathways are intimately linked to this. Many of the above processes may be modulating, or mediated by, alterations in mitochondria functioning. Other bodies of data associated with ASD may also be incorporated within these basic processes, including how ASD risk factors such as maternal obesity and preeclampsia, as well as more general prenatal stressors, modulate the likelihood of offspring ASD” (Michael Maes et al, Integrating Autism Spectrum Disorder Pathophysiology: Mitochondria, Vitamin A, CD38, Oxytocin, Serotonin and Melatonergic Alterations in the Placenta and Gut). By the way, some of the pathways involved are often discussed in terms of longevity, which indicates autistics might be at risk for shortened lifespan. Autism, indeed, is comorbid with numerous other health issues and genetic syndromes. So autism isn’t just an atypical expression on a healthy spectrum of neurodiversity.
The agricultural diet, especially in its industrially-processed variety, has a powerful impact on numerous systems simultaneously, as autism demonstrates. There is unlikely to be any single causal factor or mechanism, and the same is true of most other health conditions. We can take this a step further. With historical changes in diet, it wasn’t only fat-soluble vitamins that were lost. Humans traditionally ate nose-to-tail and this brought with it a plethora of nutrients, even some thought of as being sourced only from plant foods. In its raw or lightly cooked form, meat has more than enough vitamin C for a low-carb diet; whereas a high-carb diet, since glucose competes with vitamin C, requires higher intake of this antioxidant, which can lead to deficiencies at levels that otherwise would be adequate (Sailors’ Rations, a High-Carb Diet). Also, consider that prebiotics can be found in animal foods as well, and animal-based prebiotics likely feed a very different kind of microbiome that could shift so much else in the body, such as neurotransmitter production: “I found this list of prebiotic foods that were non-carbohydrate that included cellulose, cartilage, collagen, fructooligosaccharides, glucosamine, rabbit bone, hair, skin, glucose. There’s a bunch of things that are all — there’s also casein. But these tend to be some of the foods that actually have some of the highest prebiotic content” (from Vanessa Spina, as quoted in Fiber or Not: Short-Chain Fatty Acids and the Microbiome).
Let’s briefly mention fat-soluble vitamins again in making a point about other animal-based nutrients. Fat-soluble vitamins, similar to ketosis and autophagy, have a profound effect on human biological functioning, including that of the mind (see the work of Weston A. Price as discussed in Health From Generation To Generation; also see the work of those described in Physical Health, Mental Health). In many ways, they are closer to hormones than mere nutrients, as they orchestrate entire systems in the body and how other nutrients get used, as particularly seen with vitamin K2, which Weston A. Price discovered and called “Activator X” (only found in animal and fermented foods, not in whole or industrially-processed plant foods). I bring this up because some other animal-based nutrients play a similarly important role. Consider glycine, the main amino acid in collagen. It is available in connective tissues and can be obtained through soups and broths made from bones, skin, ligaments, cartilage, and tendons. Glycine is right up there with the fat-soluble vitamins in being central to numerous systems, processes, and organs.
As I’ve already discussed glutamate at great length, let me further that discussion by pointing out a key link. “Glycine is found in the spinal cord and brainstem where it acts as an inhibitory neurotransmitter via its own system of receptors,” writes Afifah Hamilton. “Glycine receptors are ubiquitous throughout the nervous system and play important roles during brain development. [Ito, 2016] Glycine also interacts with the glutaminergic neurotransmission system via NMDA receptors, where both glycine and glutamate are required, again, chiefly exerting inhibitory effects” (10 Reasons To Supplement With Glycine). Hamilton elucidates the dozens of roles played by this master nutrient and the diverse conditions that follow from its deprivation or insufficiency — it’s implicated in obsessive compulsive disorder, schizophrenia, and alcohol use disorder, along with much else such as metabolic syndrome. But its being essential to glutamate regulation really stands out for this discussion. “Glutathione is synthesised,” Hamilton further explains, “from the amino acids glutamate, cysteine, and glycine, but studies have shown that the rate of synthesis is primarily determined by levels of glycine in the tissue. If there is insufficient glycine available the glutathione precursor molecules are excreted in the urine. Vegetarians excrete 80% more of these precursors than their omnivore counterparts indicating a more limited ability to complete the synthesis process.” Did you catch what she is saying there? Autistics already have too much glutamate and, if they are deficient in glycine, they won’t be able to convert glutamate into the important glutathione. When the body is overwhelmed with unused glutamate, it does what it can to eliminate it, but when constantly flooded with high-glutamate intake it can’t keep up. The excess glutamate then wreaks havoc on neurocognitive functioning.
The whole mess of the agricultural diet, specifically in its modern industrialized form, has been a constant onslaught taxing our bodies and minds. And the consequences are worsening with each generation. What stands out to me about autism, in particular, is how isolating it is. The repetitive behavior and focus on objects to the exclusion of human relationships resonates with how addiction isolates the individual. As with other conditions influenced by diet (schizophrenia, ADHD, etc), both autism and addiction block normal human relating in creating an obsessive mindset that, in the most extreme forms, blocks out all else. I wonder if all of us moderns are simply expressing milder varieties of this biological and neurological phenomenon (Afifah Hamilton, Why No One Should Eat Grains. Part 3: Ten More Reasons to Avoid Wheat). And this might be the underpinning of our hyper-individualistic society, with the earliest precursors showing up in the Axial Age following what Julian Jaynes hypothesized as the breakdown of the much more other-oriented bicameral mind. What if our egoic consciousness with its rigid psychological boundaries is the result of our food system, as part of the civilizational project of mass agriculture?
* * *
Mongolian Diet and Fasting:
“Heaven grew weary of the excessive pride and luxury of China… I am from the Barbaric North. I wear the same clothing and eat the same food as the cowherds and horse-herders. We make the same sacrifices and we share our riches. I look upon the nation as a new-born child and I care for my soldiers as though they were my brothers.”
~Genghis Khan, letter of invitation to Ch’ang Ch’un
For anyone who is curious to learn more, the original point of interest was a quote by Jack Weatherford in his book Genghis Khan and the Making of the Modern World. He wrote that, “The Chinese noted with surprise and disgust the ability of the Mongol warriors to survive on little food and water for long periods; according to one, the entire army could camp without a single puff of smoke since they needed no fires to cook. Compared to the Jurched soldiers, the Mongols were much healthier and stronger. The Mongols consumed a steady diet of meat, milk, yogurt, and other dairy products, and they fought men who lived on gruel made from various grains. The grain diet of the peasant warriors stunted their bones, rotted their teeth, and left them weak and prone to disease. In contrast, the poorest Mongol soldier ate mostly protein, thereby giving him strong teeth and bones. Unlike the Jurched soldiers, who were dependent on a heavy carbohydrate diet, the Mongols could more easily go a day or two without food.” By the way, that biography was written by an anthropologist who lived among and studied the Mongols for years. It is about the historical Mongols, but filtered through the direct experience of still existing Mongol people who have maintained a traditional diet and lifestyle longer than most other populations.
As nomadic herders living on arid grasslands with no option of farming, they had limited access to plant foods from foraging, and so their diet was more easily adapted to horseback warfare, even over long distances when food stores ran out. That meant, when they had nothing else, on “occasion they will sustain themselves on the blood of their horses, opening a vein and letting the blood jet into their mouths, drinking till they have had enough, and then staunching it.” They could go on “quite ten days like this,” according to Marco Polo’s observations. “It wasn’t much,” explained Logan Nye, “but it allowed them to cross the grasses to the west and hit Russia and additional empires. […] On the even darker side, they also allegedly ate human flesh when necessary. Even killing the attached human if horses and already-dead people were in short supply” (How Mongol hordes drank horse blood and liquor to kill you). The claim of their situational cannibalism came from the writings of Giovanni da Pian del Carpini, who noted they’d eat anything, even lice. The specifics of what they ate were also determined by season: “Generally, the Mongols ate dairy in the summer, and meat and animal fat in the winter, when they needed the protein for energy and the fat to help keep them warm in the cold winters. In the summers, their animals produced a lot of milk so they switched the emphasis from meat to milk products” (from History on the Net, What Did the Mongols Eat?). In any case, animal foods were always the staple.
By the way, some have wondered how long humans have been consuming dairy, since the gene for lactose tolerance is fairly recent. In fact, “a great many Mongolians, both today and in Genghis Khan’s time are lactose intolerant. Fermentation breaks down the lactose, removing it almost entirely, making it entirely drinkable to the Mongols” (from Exploring History, Food That Conquered The World: The Mongols — Nomads And Chaos). Besides mare’s milk fermented into alcohol, they had a wide variety of other cultured dairy and aged cheese. Even then, much of the dairy would contain significant amounts of lactose. A better explanation is that many of the dairy-loving microbes have been incorporated into the Mongolian microbiome, and these microbes, in combination as a microbial ecosystem, do some mix of digesting lactose, moderating the effects of lactose intolerance, and/or somehow altering the body’s response to lactose. But looking at a single microbe might not tell us much. “Despite the dairy diversity she saw,” wrote Andrew Curry, “an estimated 95 percent of Mongolians are, genetically speaking, lactose intolerant. Yet, in the frost-free summer months, she believes they may be getting up to half their calories from milk products. […] Rather than a previously undiscovered strain of microbes, it might be a complex web of organisms and practices—the lovingly maintained starters, the milk-soaked felt of the yurts, the gut flora of individual herders, the way they stir their barrels of airag—that makes the Mongolian love affair with so many dairy products possible” (The answer to lactose intolerance might be in Mongolia).
Here is what is interesting. Based on the study of ancient corpses, it’s been determined that lactose intolerant people in this region have been including dairy in their diet for 5,000 years. And it’s not limited to the challenge of lactose intolerant people depending on a food staple that is abundant in lactose. The Mongolian population also has high rates of carrying the APOE4 gene variation that can make a diet high in saturated fat problematic (Helena Svobodová et al, Apolipoprotein E gene polymorphism in the Mongolian population). That is a significant detail, considering dairy has a higher amount of saturated fat than any other food. These people should be keeling over with nearly every disease known to humanity, particularly as they commonly drink plenty of alcohol and smoke tobacco (as was likewise true of the heart-healthy and long-lived residents of mid-20th century Roseto, Pennsylvania, with their love of meat, lard, alcohol, and tobacco; see Blue Zones Dietary Myth). Yet, it’s not the traditional Mongolians but the industrialized Mongolians who show all the health problems. A major difference between these two populations in Mongolia is diet, much of it a difference in the amount of low-carb animal foods eaten versus high-carb plant foods. Genetics are not deterministic, not in the slightest. As some others have noted, the traditional Mongolian diet would be accurately described as a low-carb paleo diet that, in the wintertime, would often have been a strict carnivore diet and ketogenic diet; although even rural Mongolians, unlike in the time of Genghis Khan, now get a bit more starchy agricultural foods. Maybe there is a protective health factor found in a diet that relies on nutrient-dense animal foods and leans toward the ketogenic.
It isn’t only that the Mongolian diet was likely ketogenic because of being low-carbohydrate, particularly on their meat-based winter diet, but also because it involved fasting. In Mongolia, Volume 1: The Tangut Country, and the Solitudes of Northern Tibet (1876), Nikolaĭ Mikhaĭlovich Przhevalʹskiĭ writes in the second note on p. 65 under the section Calendar and Year-Cycle: “On the New Year’s Day, or White Feast of the Mongols, see ‘Marco Polo’, 2nd ed. i. p. 376-378, and ii. p. 543. The monthly festival days, properly for the Lamas days of fasting and worship, seem to differ locally. See note in same work, i. p. 224, and on the Year-cycle, i. p. 435.” This is alluded to in another text, which describes that such things as fasting were the norm of that time: “It is well known that both medieval European and traditional Mongolian cultures emphasized the importance of eating and drinking. In premodern societies these activities played a much more significant role in social intercourse as well as in religious rituals (e.g., in sacrificing and fasting) than nowadays” (Antti Ruotsala, Europeans and Mongols in the middle of the thirteenth century, 2001). A science journalist trained in biology, Dyna Rochmyaningsih, also mentions this: “As a spiritual practice, fasting has been employed by many religious groups since ancient times. Historically, ancient Egyptians, Greeks, Babylonians, and Mongolians believed that fasting was a healthy ritual that could detoxify the body and purify the mind” (Fasting and the Human Mind).
Mongol shamans and priests fasted, no different than in so many other religions, but so did other Mongols — more from Przhevalʹskiĭ’s 1876 account showing the standard feast and fast cycle of many traditional ketogenic diets: “The gluttony of this people exceeds all description. A Mongol will eat more than ten pounds of meat at one sitting, but some have been known to devour an average-sized sheep in twenty-four hours! On a journey, when provisions are economized, a leg of mutton is the ordinary daily ration for one man, and although he can live for days without food, yet, when once he gets it, he will eat enough for seven” (see more quoted material in Diet of Mongolia). Fasting was also noted of earlier Mongols, such as Genghis Khan: “In the spring of 1211, Jenghis Khan summoned his fighting forces […] For three days he fasted, neither eating nor drinking, but holding converse with the gods. On the fourth day the Khakan emerged from his tent and announced to the exultant multitude that Heaven had bestowed on him the boon of victory” (Michael Prawdin, The Mongol Empire, 1967). Even before he became Khan, this was his practice, as was common among the Mongols, such that it became a communal ritual for the warriors:
“When he was still known as Temujin, without tribe and seeking to retake his kidnapped wife, Genghis Khan went to Burkhan Khaldun to pray. He stripped off his weapons, belt, and hat – the symbols of a man’s power and stature – and bowed to the sun, sky, and mountain, first offering thanks for their constancy and for the people and circumstances that sustained his life. Then, he prayed and fasted, contemplating his situation and formulating a strategy. It was only after days in prayer that he descended from the mountain with a clear purpose and plan that would result in his first victory in battle. When he was elected Khan of Khans, he again retreated into the mountains to seek blessing and guidance. Before every campaign against neighboring tribes and kingdoms, he would spend days in Burkhan Khaldun, fasting and praying. By then, the people of his tribe had joined in on his ritual at the foot of the mountain, waiting his return” (Dr. Hyun Jin Preston Moon, Genghis Khan and His Personal Standard of Leadership).
As an interesting side note, the Mongol population has been studied to some extent in one area of relevance. In Down’s Anomaly (1976), Smith et al. write that, “The initial decrease in the fasting blood sugar was greater than that usually considered normal and the return to fasting blood sugar level was slow. The results suggested increased sensitivity to insulin. Benda reported the initial drop in fasting blood sugar to be normal but the absolute blood sugar level after 2 hours was lower for mongols than for controls.” That is probably the result of a traditional low-carb diet that had been maintained continuously since before history. For some further context, I noticed some discussion about the Mongolian keto diet (Reddit, r/keto, TIL that Ghenghis Khan and his Mongol Army ate a mostly keto based diet, consisting of lots of milk and cheese. The Mongols were specially adapted genetically to digest the lactase in milk and this made them easier to feed.) that was inspired by the scientific documentary “The Evolution of Us” (presently available on Netflix and elsewhere).
As a concluding thought, we may have the Mongols to thank for the modern American hamburger: “Because their cavalry was traveling so much, they would often eat while riding their horses towards their next battle. The Mongol soldiers would soften scraps of meat by placing it under their saddles while they rode. By the time the Mongols had time for a meal, the meat would be “tenderized” and consumed raw. […] By no means did the Mongols have the luxury of eating the kind of burgers we have today, but it was the first recorded time that meat was flattened into a patty-like shape” (Anna’s House, Brunch History: The Shocking Hamburger Origin Story You Never Heard; apparently based on the account of Jean de Joinville, who was born a few years after Genghis Khan’s death). The Mongols introduced it to Russia, in what was called steak tartare (Tartars being one of the ethnic groups in the Mongol army); the Russians introduced it to Germany, where it was most famously called hamburg steak (because sailors were served it at the ports of Hamburg); and from there it was introduced to the United States by way of German immigrants sailing out of Hamburg. Another version of this is Salisbury steak, invented during the American Civil War by Dr. James Henry Salisbury (physician, chemist, and medical researcher) as part of a meat-based, low-carb diet for medically and nutritionally treating certain diseases and ailments.
* * *
3/30/19 – An additional comment: I briefly mentioned sugar, that it causes a serotonin high and activates the hedonic pathway. I also noted that it was late in civilization when sources of sugar were cultivated and, I could add, even later when sugar became cheap enough to be common. Even into the 1800s, sugar was minimal and still often considered more as medicine than food.
To extend this thought, it isn’t only sugar in general but specific forms of it (Yu Hue, Fructose and glucose can regulate mammalian target of rapamycin complex 1 and lipogenic gene expression via distinct pathways). Fructose, in particular, has become widespread because the United States government subsidizes corn agriculture, which has created a greater corn yield than humans can directly consume. So, what doesn’t get fed to animals or turned into ethanol mostly is made into high fructose corn syrup and then added into almost every processed food and beverage imaginable.
Fructose is not like other sugars. This was important for early hominid survival and so shaped human evolution. It might have played a role in fasting and feasting. In 100 Million Years of Food, Stephen Le writes that, “Many hypotheses regarding the function of uric acid have been proposed. One suggestion is that uric acid helped our primate ancestors store fat, particularly after eating fruit. It’s true that consumption of fructose induces production of uric acid, and uric acid accentuates the fat-accumulating effects of fructose. Our ancestors, when they stumbled on fruiting trees, could gorge until their fat stores were pleasantly plump and then survive for a few weeks until the next bounty of fruit was available” (p. 42).
That makes sense to me, but he goes on to argue against this possible explanation. “The problem with this theory is that it does not explain why only primates have this peculiar trait of triggering fat storage via uric acid. After all, bears, squirrels, and other mammals store fat without using uric acid as a trigger.” This is where Le’s knowledge is lacking, for he never discusses ketosis, which has been centrally important for humans, unlike for other animals. If uric acid increases fat production, that would be helpful for fattening up before the next starvation period, when the body returned to ketosis. So, there would be a regular switching back and forth between formation of uric acid that stores fat and formation of ketones that burns fat.
That is fine and dandy under natural conditions. Excess fructose on a continuous basis, however, is a whole other matter. It has been strongly associated with metabolic syndrome. One pathway of causation is this increased production of uric acid, which can lead to gout (wrongly blamed on meat) but other things as well. It’s a mixed bag. “While it’s true that higher levels of uric acid have been found to protect against brain damage from Alzheimer’s, Parkinson’s, and multiple sclerosis, high uric acid unfortunately increases the risk of brain stroke and poor brain function” (Le, p. 43).
The potential side effects of uric acid overdose are related to other problems I’ve discussed in relation to the agricultural mind. “A recent study also observed that high uric acid levels are associated with greater excitement-seeking and impulsivity, which the researchers noted may be linked to attention deficit hyperactivity disorder (ADHD)” (Le, p. 43). The problems of sugar go far beyond mere physical disease. It’s one more factor in the drastic transformation of the human mind.
* * *
4/2/19 – More info: There are certain animal fats, the omega-3 fatty acids EPA and DHA, that are essential to human health (Georgia Ede, The Brain Needs Animal Fat). These were abundant in the hunter-gatherer diet. But over the history of agriculture, they have become less common.
This is associated with psychiatric disorders and general neurocognitive problems, including those already mentioned above in the post. Agriculture and industrialization have replaced these healthy lipids with industrially-processed seed oils that are high in linoleic acid (LA), an omega-6 fatty acid. LA interferes with the body’s use of omega-3 fatty acids. Worse still, these seed oils appear not only to alter gene expression (epigenetics) but also to be mutagenic, a possible causal factor behind conditions like autism (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations).
“Biggest dietary change in the last 60 years has been avoidance of animal fat. Coincides with a huge uptick in autism incidence. The human brain is 60 percent fat by weight. Much more investigation needed on correspondence between autism and prenatal/child ingestion of dietary fat.”
~ Brad Lemley
The agricultural diet, along with a drop in animal foods, saw a loss of access to the high levels and full profile of B vitamins. As with the later industrial seed oils, this had a major impact on genetics:
“The phenomenon wherein specific traits are toggled up and down by variations in gene expression has recently been recognized as a result of the built-in architecture of DNA and dubbed “active adaptive evolution.” 44
“As further evidence of an underlying logic driving the development of these new autism-related mutations, it appears that epigenetic factors activate the hotspot, particularly a kind of epigenetic tagging called methylation. 45 In the absence of adequate B vitamins, specific areas of the gene lose these methylation tags, exposing sections of DNA to the factors that generate new mutations. In other words, factors missing from a parent’s diet trigger the genome to respond in ways that will hopefully enable the offspring to cope with the new nutritional environment. It doesn’t always work out, of course, but that seems to be the intent.”
~Catherine Shanahan, Deep Nutrition, p. 56
And one last piece of evidence on the essential nature of animal fats:
“Maternal intake of fish, a key source of fatty acids, has been investigated in association with child neurodevelopmental outcomes in several studies. […]
“Though speculative at this time, the inverse association seen for those in the highest quartiles of intake of ω-6 fatty acids could be due to biological effects of these fatty acids on brain development. PUFAs have been shown to be important in retinal and brain development in utero (37) and to play roles in signal transduction and gene expression and as components of cell membranes (38, 39). Maternal stores of fatty acids in adipose tissue are utilized by the fetus toward the end of pregnancy and are necessary for the first 2 months of life in a crucial period of development (37). The complex effects of fatty acids on inflammatory markers and immune responses could also mediate an association between PUFA and ASD. Activation of the maternal immune system and maternal immune aberrations have been previously associated with autism (5, 40, 41), and findings suggest that increased interleukin-6 could influence fetal brain development and increase risk of autism and other neuropsychiatric conditions (42–44). Although results for effects of ω-6 intake on interleukin-6 levels are inconsistent (45, 46), maternal immune factors potentially could be affected by PUFA intake (47). […]
“Our results provide preliminary evidence that increased maternal intake of ω-6 fatty acids could reduce risk of offspring ASD and that very low intakes of ω-3 fatty acids and linoleic acid could increase risk.”
~Kristen Lyall et al, Maternal Dietary Fat Intake in Association With Autism Spectrum Disorders
* * *
6/13/19 – About the bicameral mind, I saw some other evidence for it in relationship to fasting. In the following quote, it is described that after ten days of fasting ancient humans would experience spirits. One thing that is certain is that one can be fully in ketosis within three days. This would be true even if it wasn’t total fasting, as the caloric restriction would achieve the same end.
The author, Michael Carr, doesn’t think fasting was the cause of the spirit visions, but he doesn’t explain the reason(s) for his doubt. There is a long history of fasting used to achieve this intended outcome. If fasting was ineffective for this purpose, why has nearly every known traditional society for millennia used such methods? These people knew what they were doing.
By the way, imbibing alcohol after the fast would really knock someone into an altered state. The body becomes even more sensitive to alcohol when in ketogenic state during fasting. Combine this altered state with ritual, setting, cultural expectation, and archaic authorization. I don’t have any doubt that spirit visions could easily be induced.
Reflections on the Dawn of Consciousness
ed. by Marcel Kuijsten
Kindle Location 5699-5718
Chapter 13
The Shi ‘Corpse/ Personator’ Ceremony in Early China
by Michael Carr
“”Ritual Fasts and Spirit Visions in the Liji” 37 examined how the “Record of Rites” describes zhai 齋 ‘ritual fasting’ that supposedly resulted in seeing and hearing the dead. This text describes preparations for an ancestral sacrifice that included divination for a suitable day, ablution, contemplation, and a fasting ritual with seven days of sanzhai 散 齋 ‘relaxed fasting; vegetarian diet; abstinence (esp. from sex, meat, or wine)’ followed by three days of zhizhai 致 齋 ‘strict fasting; diet of grains (esp. gruel) and water’.
“Devoted fasting is inside; relaxed fasting is outside. During fast-days, one thinks about their [the ancestor’s] lifestyle, their jokes, their aspirations, their pleasures, and their affections. [After] fasting three days, then one sees those [spirits] for whom one fasted. On the day of the sacrifice, when one enters the temple, apparently one must see them at the spirit-tablet. When one returns to go out the door [after making sacrifices], solemnly one must hear sounds of their appearance. When one goes out the door and listens, emotionally one must hear sounds of their sighing breath. 38
“This context unequivocally uses biyou 必 有 ‘must be/ have; necessarily/ certainly have’ to describe events within the ancestral temple; the faster 必 有 見 “must have sight of, must see” and 必 有 聞 “must have hearing of, must hear” the deceased parent. Did 10 days of ritual fasting and mournful meditation necessarily cause visions or hallucinations? Perhaps the explanation is extreme or total fasting, except that several Liji passages specifically warn against any excessive fasts that could harm the faster’s health or sense perceptions. 39 Perhaps the explanation is inebriation from drinking sacrificial jiu 酒 ‘( millet) wine; alcohol’ after a 10-day fast. Based on measurements of bronze vessels and another Liji passage describing a shi personator drinking nine cups of wine, 40 York University professor of religious studies Jordan Paper calculates an alcohol equivalence of “between 5 and 8 bar shots of eighty-proof liquor.” 41 On the other hand, perhaps the best explanation is the bicameral hypothesis, which provides a far wider-reaching rationale for Chinese ritual hallucinations and personation of the dead.”
This made me immediately wonder how it relates to diet. Changes in diet alter hormonal functioning. Endocrinology, the study of hormones, has been a major part of the diet debate going back to European researchers from earlier last century (as discussed by Gary Taubes). Diet affects hormones and hormones in turn affect diet. But I had something more specific in mind.
What about propionate and glutamate? What might their relationship be to testosterone? In a brief search, I couldn’t find anything about propionate. But I did find some studies related to glutamate. There is an impact on the endocrine system, although these studies weren’t looking at the results in terms of autism specifically or neurocognitive development in general. It points to some possibilities, though.
One could extrapolate from one of these studies that increased glutamate in the pregnant mother’s diet could alter what testosterone does to the developing fetus, in that testosterone increases the toxicity of glutamate which might not be a problem under normal conditions of lower glutamate levels. This would be further exacerbated during breastfeeding and later on when the child began eating the same glutamate-rich diet as the mother.
11/28/21 – Here is some discussion of vitamin B1 (thiamin/thiamine). It couldn’t easily fit into the above post without revising and rewriting some of it. And it could’ve been made into a separate post by itself. But, for the moment, we’ll look at some of the info here, as relevant to the above survey and analysis. This section will be used as a holding place for some developing thoughts, although we’ll try to avoid getting off-topic in a post that is already too long. Nonetheless, we are going to have to trudge a bit into the weeds so as to see the requisite details more clearly.
Related to autism, consider this highly speculative hypothesis: “Thiamine deficiency is what made civilization. Grains deplete it, changing the gut flora to make more nervous and hyperfocused (mildly autistic) humans who are afraid to stand out. Conformity. Specialization in the division of labor” (JJ, Is Thiamine Deficiency Destroying Your Digestive Health? Why B1 Is ESSENTIAL For Gut Function, EONutrition). Thiamine deficiency is also associated with delirium and psychosis, such as in schizophrenia (the relevant scientific papers are too numerous to list). By the way, psychosis, along with mania, has an established psychological and neurocognitive overlap with measures of modern conservatism; in opposition to the liberal link to mood disorders, addiction, and alcoholism (Uncomfortable Questions About Ideology; & Radical Moderates, Depressive Realism, & Visionary Pessimism). This is part of some brewing thoughts that won’t be further pursued here.
The point is simply to emphasize the argument that modern ideologies, as embodied worldviews and social identities, may partly originate in or be shaped by dietary and nutritional factors, among much else in modern environments and lifestyles. Nothing even comparable to conservatism and liberalism existed as such prior to the expansion and improvement of agriculture during the Axial Age (farm fields were made more uniform and well-managed, and hence with higher yields; e.g., systematic weeding became common as opposed to letting fields grow in a semi-wild state); and over time there were also innovations in food processing (e.g., removing hulls from grains made them last longer in storage while having the unintended side effect of also removing a major source of vitamin B1, needed to help metabolize carbs).
In the original writing of this post, one focus was on addiction. Grains and dairy were noted as sources of exorphins and dopaminergic peptides, as well as propionate and glutamate. This goes a long way toward explaining the addictive quality of these foods and their relationship to the repetitive behavior of obsessive-compulsive disorder. This is seen in many psychiatric illnesses and neurocognitive conditions, including autism (Derrick Lonsdale et al, Dysautonomia in Autism Spectrum Disorder: Case Reports of a Family with Review of the Literature):
“It has been hypothesized that autism is due to mitochondrial dysfunction [49], supported more recently [50]. Abnormal thiamine homeostasis has been reported in a number of neurological diseases and is thought to be part of their etiology [51]. Blaylock [52] has pointed out that glutamate and aspartate excitotoxicity is more relevant when there is neuron energy failure. Brain damage from this source might be expected in the very young child and the elderly when there is abnormal thiamine homeostasis. In thiamine-deficient neuroblastoma cells, oxygen consumption decreases, mitochondria are uncoupled, and glutamate, formed from glutamine, is no longer oxidized and accumulates [53]. Glutamate and aspartate are required for normal metabolism, so an excess or deficiency are both abnormal. Plaitakis and associates [54] studied the high-affinity uptake systems of aspartate/glutamate and taurine in synaptosomal preparations isolated from brains of thiamine-deficient rats. They concluded that thiamine deficiency could impair cerebellar function by inducing an imbalance in its neurotransmitter systems.”
We’ve previously spoken of glutamate, a key neurotransmitter; but let’s summarize it while adding in new info. Among those on the autistic spectrum, there is commonly a glutamate excess. This is caused by eating a lot of processed foods that use glutamate as an additive (e.g., MSG). And there is the contributing factor of many autistics being drawn to foods naturally high in glutamate, specifically dairy and wheat. A high-carb diet also promotes the body’s own production of glutamate, with carb-related inflammation spiking glutamate levels in the brain; and it downregulates the levels of the inhibitory neurotransmitter GABA that balances glutamate. GABA is important for sleep and much else.
Keep in mind that thiamine is required in the production of numerous other neurotransmitters and in the balanced interaction between them. Another B vitamin, B12 (cobalamin), plays a similar role; and its deficiency is not uncommonly seen as well. The B vitamins, by the way, are particularly concentrated in animal foods, as are other key nutrients. Think about choline, precursor of acetylcholine, which promotes sensory habituation, perceptual regulation, attentional focus, executive function, and selective responsiveness while supporting mental flexibility (thiamine is also needed in making acetylcholine, and notably choline has some similarities to B vitamins); similarly, the amino acid L-tyrosine further promotes mental flexibility — the two form a balance of neurocognitive functioning, both of which can be impaired in diverse psychiatric diseases, neurological conditions, speech/language issues, learning disabilities, etc.
There is far too much scientific evidence to cite and survey here, but let’s briefly focus on some examples involving choline, a nutrient easily found in eggs, meat, liver, and seafood. Studies indicate choline may help prevent mental health issues like schizophrenia and ADHD, which involve the sensory inhibition and attention problems that can contribute to social withdrawal (Bret Stetka, Can Mental Illness Be Prevented In The Womb?). Autism spectrum disorders and mood disorders, in being linked to choline deficiency, likewise exhibit social withdrawal. In autism, the sensory inhibition challenge is experienced as sensory overload and hyper-sensitivity (Anuradha Varanasi, Hypersensitivity Might Be Linked To A Transporter Protein Deficiency In The Brain: Study).
Mental flexibility, specifically, seems less relevant to modern society; or rather, maybe its suppression has made possible the rise of modern society, as hyper-specialization has become central to most modern work, which is narrowly focused and repetitive. Yet one might note that modern liberalism strongly correlates with mental flexibility; e.g., Ernest Hartmann’s fluid and thin boundaries of mind, the Big Five trait of openness to experience, and Myers-Briggs intuition and perceiving. By the way, a liberal arts education is defined by its not being specialized, and that is precisely what makes it ‘liberal’ (i.e., generous, expansive, inclusive, diverse, tolerant, multiperspectival, etc.).
Maybe this also relates to how modern liberalism, as an explicit socio-ideological identity, has typically been tied to the greater wealth of the middle-to-upper classes, and hence to greater access to nutritious foods and costly supplements, not to mention high-quality healthcare that tests for nutritional deficiencies and treats them early on; along with higher status, more privileges, and less stress within the high-inequality hierarchy of the American caste system. There is a significant amount of truth to the allegation of a ‘liberal elite’, which in some ways applies to the relatively more liberal-minded conservative elites as well. It would be interesting to know whether malnutrition or specific nutritional deficiencies increase social conservatism, similar to studies that have shown a link between parasite load and authoritarianism (in this blog, it’s been pointed out that all authoritarianism is socially conservative, not only the likes of the Nazis but also the Soviets, Maoists, and others, all of which targeted social liberals and those under the protection of socially liberal society).
Many other factors can destabilize this delicate system. To return to glutamate: it is one of the three precursors in producing the endogenous antioxidant glutathione. A major limit to this process is glycine, which primarily comes from the connective tissue of animal foods (tough meats, gristle, bone broths, etc.). Without sufficient glycine, glutamate won’t get used up and so will accumulate. Plus, glycine directly interacts with the glutamatergic neurotransmission system and so is needed for the healthy functioning of glutamate. Further complicating matters, mercury toxicity can over-excite the glutamate pathway. Then, as already described, the modern diet dumps even more glutamate on the fire. It’s a whole freaking mess, the complex and overlapping conditions of modernity. Altering any single factor would be enough to throw a wrench into the works, but what we’re talking about is nearly every major factor, along with many minor factors, all being tossed up in the air at once.
The standard American diet is high in refined carbs while low in certain animal-based nutrients that were more typical of a traditional nose-to-tail diet. About the first part: refined carbs are low in vitamin B1 (thiamin/thiamine), though governments have required fortification with such key nutrients. The problem is that thiamine is required for the metabolism of carbs; the more carbs one eats, the more thiamine is needed. Carb intake has risen so vastly that, as some argue, the levels of fortification aren’t enough. To make matters worse, because thiamine deficiency disrupts carb metabolism, there is an increasing craving for carbs as the body struggles to get the fuel it needs. Then, as those cravings lead to continued overeating of carbs, the thiamine deficiency worsens, which makes the carb cravings even stronger. It becomes a lifelong addiction, in some cases involving alcoholism as liquid carbs (the body metabolizes alcohol much as it does sugar).
The only alternative fuel for the body is fat. Here we get to another wrinkle. A high-carb diet also causes insulin resistance. The hormone insulin, like thiamine, is needed in energy metabolism. This often leads to obesity, where excess calories get stored as fat; but, without insulin sensitivity, the body can’t easily access that stored energy. This is why fat people are constantly hungry, despite having immense stored energy: their bodies can’t fully use that stored energy, and neither can they fully use the carbs they’re eating. Thiamine deficiency combined with insulin resistance is a spiral of metabolic dysfunction. This is why some experts in this field worry that thiamine insufficiency might be more widespread than acknowledged and that it might not show up on standard tests, since what is not being considered is the higher demand for thiamine that comes with a higher intake of carbs than has ever existed before. To further obscure this health crisis, it is irrelevant how much thiamine a test shows in one’s bloodstream if one lacks the cofactors (e.g., magnesium) that help the body process thiamine and transport it into cells.
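Since the argument turns on simple arithmetic, here is a minimal back-of-the-envelope sketch (our illustration, not a calculation from any cited source), assuming the commonly cited heuristic of roughly 0.5 mg of thiamine per 1,000 kcal metabolized and the standard 4 kcal per gram of carbohydrate; the example intakes are arbitrary:

```python
# Back-of-the-envelope sketch: how thiamine demand scales with carb intake.
# Assumptions (not from the post itself): ~0.5 mg thiamine per 1,000 kcal,
# a commonly cited basis for thiamine RDAs, and 4 kcal per gram of carbs.

KCAL_PER_GRAM_CARB = 4             # standard Atwater factor for carbohydrate
THIAMINE_MG_PER_1000_KCAL = 0.5    # commonly cited heuristic

def thiamine_demand_mg(carb_grams_per_day: float) -> float:
    """Rough daily thiamine needed just to metabolize a given carb intake."""
    kcal_from_carbs = carb_grams_per_day * KCAL_PER_GRAM_CARB
    return kcal_from_carbs / 1000 * THIAMINE_MG_PER_1000_KCAL

for grams in (150, 300, 500):      # moderate, high, and very high intakes
    print(f"{grams} g carbs/day -> ~{thiamine_demand_mg(grams):.2f} mg thiamine")
```

The output simply shows demand rising linearly with intake (about 0.3, 0.6, and 1.0 mg respectively), which is the worry in a nutshell: a fixed level of fortification cannot track an ever-rising carb consumption.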
Insulin resistance, along with the rest of metabolic syndrome, has many neurological consequences. Numerous neurocognitive conditions are directly linked to it and often involve thiamine deficiency; besides autism, there are mood disorders, obsessive-compulsive disorder, schizophrenia, etc. For example, consider Alzheimer’s, which some now refer to as type III diabetes because there is insulin resistance in the brain; and the brain requires glucose, which in turn requires insulin and insulin sensitivity. All cells need energy, and this goes to the centrality of the mitochondria, the powerhouses of cellular energy (each cell can have thousands of mitochondria). Besides autoimmune conditions like multiple sclerosis, mitochondrial dysfunction might also be involved in conditions like autism. That relates to thiamine deficiency causing energy deficiency, which in turn affects the role of glutamate.
It’s a morass of intertwining mechanisms, pathways, and systems that is hard for a layman to comprehend. But it is serious stuff on so many levels, for individuals and society. For a moment, let’s step back and look again at the big picture. In The Crisis of Identity, public health was explained as a moral panic and existential crisis. One aspect that wasn’t explored in that post is cancer, but we did briefly note that, “in the mid-1800s, Stanislas Tanchou did a statistical analysis that correlated the rate of grain consumption with the rate of cancer; and he observed that cancer, like insanity, spread along with civilization.” We only bring this up now because we’ve been reading Sam Apple’s book Ravenous, which is about the Nazi obsession with cancer, a mass hysteria on par with others then gripping the Western world (e.g., over neurasthenia and tuberculosis), and one that brought up antisemitism everywhere it was found.
Cancer, though, can help us understand an aspect of thiamine deficiency and insufficiency, one that also has to do with neurological and mental health. In interfering with carb metabolism, insufficient thiamine also interferes with mitochondrial oxidation, and so cells turn to fermenting glucose for energy. This is what happens in cancer cells, as the German-Jewish scientist Otto Warburg, who kept working under the Nazi regime, thought so important. In general, mitochondrial dysfunction results and energy production goes down. Also, the mitochondria are closely tied to immune functioning, and so autoimmune disorders can follow: multiple sclerosis, Hashimoto’s, rheumatoid arthritis, etc. Along with causing gut issues and a diversity of other symptoms, this is why thiamine deficiency is known as a disease mimic, so often getting misdiagnosed as something else.
That is a problem with something like psychiatric categories and labels, as they are simply groupings of symptoms; but then again, that is true of most conventional healthcare. We need to discern the underlying cause(s). To demonstrate this, we’ll now move on to the limbic system, a part of the primitive brain having to do with emotional processing and control of the autonomic nervous system. Thiamine deficiency has a strong impact on limbic cells, similar to an oxygen deficiency, because of the aforementioned altered energy metabolism of mitochondria, which rely on oxygen to produce ATP (the main fuel used by most cells). There is not only a loss of energy but eventually mitochondrial death and hence cell death, also from decreased glucose utilization in cells; or, in some cases, something worse, when cells refuse to die (i.e., cancer) and turn to glucose fermentation, which allows them to proliferate. In either case, the handling of carbs and glucose becomes dramatically changed and imbalanced.
This points to how the same fundamental issues deep within our physiology can become expressed in numerous ways, such as the link between cancer and metabolic syndrome (particularly obesity). But, in terms of subjective experience, we can’t perceive most of what is going on, and even doctors often aren’t able to detect it with the crude tools at hand. Yet the individual might experience the consequences of what can’t be seen. If thiamine deficiency causes brain damage in the limbic system and elsewhere, the results can be depression, anxiety, irritability, fatigue, bipolar disorder, emotional instability, moodiness, confusion, schizophrenia, cognitive decline, learning difficulties, inability to form memories, loss of memory recall, confabulation (making up stories), etc.; with the worst symptoms corresponding to Wernicke-Korsakoff syndrome, which untreated can ultimately (and sometimes very rapidly) turn fatal. Now multiply that across an entire society, and it’s no wonder the reactionary mind has taken hold and created such a powerful psychological undertow, not only for conservatives but for everyone.
* * *
6/2/22 – Let’s make yet another subsection to throw in some other info. This is an extension of what has already been said on the growing number of factors involved in autism spectrum disorder, not to mention often overlapping with numerous other physical and cognitive conditions. There are so many proven and potential factors (correlated, contributing, and causal) that it can give one a headache trying to piece it all together and figure out what it means. Writing about it here is nearly headache-inducing, and so empathy goes out to any readers trying to work their way through this material. Such diverse and wide-ranging evidence might imply that so-called autism spectrum disorder is not really a single disorder but a blanket label to cover up mass complexity and confusion. Okay. Take a deep breath.
An interesting substance is carnitine, which is needed for energy production, helping transport fatty acids into the mitochondria. Low carnitine levels are prevalent in certain neurocognitive conditions, from depression to autism. “Some tenuous links between carnitine and autism already exist. Defects in the mitochondria, which have previously been linked to autism, can sometimes lead to carnitine deficiency. And treating children with autism with valproic acid, an anti-seizure medicine that can lower carnitine levels, can have serious side effects” (Emily Singer, Defects in carnitine metabolism may underlie autism). It’s one of the many nutrients that are mostly found in, or entirely exclusive to, animal foods, and so it has much to do with the agricultural diet and even more so with modern industrial food production. For such an easily obtained substance, a significant number of Westerners are not getting enough of it. Yet all they’d need to do to obtain it is eat some red meat, which is precisely the main food that health experts and public officials have been telling Americans to avoid.
Beef consumption is almost half of what it was at the beginning of the 19th century, and it has since leveled out, whereas low-carnitine meats such as chicken and fish have increasingly replaced beef. About the agricultural angle, it might be noted that grain-fed animals have lower amounts of diverse nutrients (carnitine, choline, CoQ10, zinc, carotenoids, vitamin A, E vitamins, omega-3s, etc.) compared to pasture-raised and wild-caught animals, except for certain nutrients that are typically added to animal feed; and this might partly explain why the agricultural revolution led to increased stunting and sickliness, many thousands of years before the modern industrialized diet of hyper-processed foods produced by industrial agriculture. So, it’s not only that modern Americans are eating less red meat but that they are replacing such nutrient density with lower-quality animal foods from factory farming; meanwhile, overall meat consumption has dropped since the 19th century, and animal fat intake drastically declined after being mostly replaced with industrial seed oils by the 1930s. It’s safe to say that the average American is consuming approximately zero fatty ruminant meat or any other animal foods from pasture-raised or wild-caught animals. Yet the intake of vegetables, fruits, nuts, seeds, and seed oils is greater than in past centuries.
To refocus: the human body has some capacity to produce carnitine de novo, but it’s limited and far from optimal. Autistics, in particular, can have carnitine-related genetic defects, such as a deletion in the gene trimethyllysine hydroxylase epsilon (TMLHE), a defect mostly found in families with multiple autistic boys. Also, as expected, vegans and vegetarians measure as having low plasma levels of this key nutrient. Such deficiencies are potentially a worse problem for certain modern populations than they were in the past because “genetic deficiencies in carnitine synthesis were tolerated in the European population because their effects were nutritionally complemented by a carnitine-rich diet. In this manner, the selection pressures that would have otherwise eliminated such mutations from the population were effectively removed” (Vytas A. Bankaitis & Zhigang Xie, The neural stem cell/carnitine malnutrition hypothesis: new prospects for effective reduction of autism risk?). As for the present, the authors “estimate that some 20%–30% of pregnant women in the United States might be exposing the developing fetus to a suboptimal carnitine environment.”
Carnitine underpins many physiological factors and functions involving embryonic neural stem cells, long-chain fatty acids, mitochondrial function, ATP production, oxidative stress, inflammation, epigenetic regulation of gene expression, etc. As mediated by epigenetic control, carnitine promotes “the switch from solitary to gregarious social behavior” in other species and likely in humans as well (Rui Wu et al, Metabolomic analysis reveals that carnitines are key regulatory metabolites in phase transition of the locusts). Certainly, as Bankaitis and Xie explain, carnitine deficiency is directly correlated with language/speech delay, language weakness, or speech deficits, along with stunted motor development and common autistic behaviors, causally linked by way of long-chain fatty acid (LCFA) β-oxidation deficits, medium-chain FAO deficits, etc. To emphasize this point: overlapping with the same deficiencies (carnitine, B vitamins, fat-soluble vitamins, choline, etc.) and excesses (glutamate, propionate, etc.) found in autism, there are many other speech and language conditions, such as dyslexia, specific language impairment (SLI), and developmental language disorder (DLD), along with ADHD, learning disabilities, and much else (about all of this, approximately a million studies have been done and another million articles written). These might not always be entirely distinct categories but imperfect labels for capturing a swarm of underlying issues, as some experts in the field have suggested.
Toxins worsen these problems: “Exposure of a pregnant woman to high levels of heavy metals in drinking water or otherwise also carries the risk of impairing de novo carnitine biosynthesis.” In the main text of this post, there was much exploration of glutamate (e.g., MSG) as a neurotoxin. On a related note, acetyl-L-carnitine (ALCAR or LAC) “supplements ameliorate depressive symptoms in mice by reversing brain-cell impairment caused by an excess of glutamate” (Bruce S. McEwen, Lack of a single molecule may indicate severe and treatment-resistant depression; see: Carla Nasca et al, Acetyl-L-carnitine deficiency in patients with major depressive disorder). A similar protective role is found with other “compounds containing a trimethylamine group (carbachol, betaine, etc.)” (Marta Llansola et al, Prevention of ammonia and glutamate neurotoxicity by carnitine: molecular mechanisms). Furthermore, “L-carnitine can protect from Hepatotoxic, neurotoxic, renal impairment and genotoxic effects functionally, biochemically and histopathologically with a corresponding reduction of oxidative stress” (Krishna Murthy Meesala & Pratima Khandayataray, Monosodium Glutamate Toxicity and the Possible Protective Role of L–Carnitine). It’s fascinating that one set of toxins, heavy metals, would interfere with carnitine levels when carnitine is needed to deal with other toxins, glutamate and ammonia.
Bankaitis and Xie then conclude: “Finally, we are struck by the fact that two developments dominating public interest in contemporary news cycles detail the seemingly unrelated topics of the alarming rise of autism in young children and the damaging human health and planetary-scale environmental costs associated with cattle farming and consumption of red meat (86.). The meteoric rise of companies promoting adoption of meatless mimetics of beef and chicken at major fast food outlets testifies to the rapidly growing societal appetite for reducing meat consumption. This philosophy is even rising to the level of circulation of scientific petitions exhorting world governments to unite in adopting global measures to restrict meat consumption (87). We now pose the question whether such emerging societal attitudes regarding nutrition and its environmental impact are on collision course with increased ASD risk. Food for thought, indeed.” It’s been shown that mothers of autistic children ate less meat before conception, during pregnancy, or during the lactation period, and had lower levels of calcium (Ya-Min Li, Maternal dietary patterns, supplements intake and autism spectrum disorders). Sure, we could supplement carnitine and every other nutrient concentrated in meat. That might well help bring the autism rate back down (David A. Geier et al, A prospective double-blind, randomized clinical trial of levocarnitine to treat autism spectrum disorders). But maybe, instead, we should simply emphasize a healthy diet of nutrient-dense animal foods, particularly as whole foods.
It might be about finding the right form in the right amount, maybe in the needed ratio with other nutrients, our partial knowledge and vast ignorance being the eternal problem (Hubris of Nutritionism); whereas animal foods, particularly pasture-raised and wild-caught, have all of the nutrients we need in the forms, amounts, and ratios in which we need them. As clever monkeys, we’ve spent the past century failing in our endeavor to industrially and medically re-create the wheel that Mother Nature invented through evolution. To put this in the context of everything analyzed in this unwieldy piece: if most modern people weren’t following a nutritionally deficient agricultural diet largely consisting of industrially hyper-processed and fortified plant foods, nearly all of the scientific disagreement and debate would be irrelevant. We’ve painted ourselves into a corner. The fact of the matter is that we are a sickly people, and much of that is caused by diet; nor is it limited to micronutrients, as the macronutrients play a particular role in metabolic health or the lack thereof, which in turn is another contributing factor to autism (Alison Jean Thomas, Is a Risk of Autism Related to Nutrition During Pregnancy?). And metabolic dysfunction and disease have much to do with the addictive and/or harmful overconsumption of agricultural foods like grains, potatoes, sugar cane, high-fructose corn syrup, seed oils, etc.
For vitamin B9, some speculate that the increased risk of autism might have to do with methylation defects caused by mutations in the MTHFR gene (A1298C and C677T), or even, in those without such mutations, with high folic acid intake possibly mimicking the phenomenon (Karen E Christensen, High folic acid consumption leads to pseudo-MTHFR deficiency, altered lipid metabolism, and liver injury in mice). This relates to one reason behind recommendations for methylated forms of B vitamins, which are a good source of the methyl groups required for various physiological functions. For example, in demonstrating how one thing leads to another: “The methyl group from methyl folate is given to SAMe, whose job it is to deliver methyl to 200 essential pathways in the body. […] After receiving methyl donors, SAMe delivers methyl to 200 pathways in the body including ones needed to make carnitine, creatine and phosphotidylcholine. Carnitine supplementation improves delivery of omega 3 & 6 fatty acids needed to support language, social and cognitive development. Phosphatidylcholine is important in cell membrane health and repair. […] Repair of the cell membrane is an important part of improving sensory issues and motor planning issues in children with autism, ADHD and sensory integration disorder. Dimethylglycine (DMG) and trimethylglycine (TMG) donate methyl groups to the methylation cycle. TMG is needed to recycle homocysteine and help produce SAMe” (Treat Autism, Autism and Methylation – Are you helping to repair your child’s methylation cycle?).
Others dismiss these skeptical concerns and alternative theories as pseudo-scientific fear-mongering. The debate began with a preliminary study done in 2016; and, in the following year, a published review concurred that, “Based on the evidence evaluated, we conclude that caution regarding over supplementing is warranted” (Darrell Wiens & M. Catherine DeSoto, Is High Folic Acid Intake a Risk Factor for Autism?—A Review). There are other issues besides that. There has been a quarter century of mass supplementation of folate through fortified foods, yet apparently no safety studies or analyses were ever done for the general population. On top of that, phthalate exposure from plastic contamination in water and elsewhere disrupts genetic signals for the processing of folate (Living On Earth, Plastics Linked to Rising Rates of Autism). But supplementation of folic acid might compensate for this (Nancy Lemieux, Study reports link between phthalates and autism, with protective effects of folic acid). Plastic breaks down into microplastics that can accumulate in the biological tissues humans consume, though it’s unclear whether the same is true of plants and how much phthalates can accumulate up the food chain. So, it’s not clear how this may or may not be a problem specifically within present agriculture, but one suspects it might be an issue. Certainly, the majority of water in the world is now contaminated with microplastics and much else; and that water is used for livestock and agricultural goods. It’s hard to imagine how such things couldn’t be getting into everything, or what it might mean for changes in the human body-mind, as compounded by all the rest (e.g., how various substances interact within the body). About pesticides in the water or from other sources, one might note that folic acid may have a protective effect against autism (Arkansas Folic Acid Coalition, Folic Acid May Reduce Autism Risk from Pesticides).
Whatever it all means, it’s obvious that the B vitamins are among the many super important nutrients mostly found in animal foods and concentrated in the highest amounts in the highest-quality sources, from animals raised on pasture or in the wild. Much of the B vitamin debate about autism risk is too complex and murky to analyze further here, not to mention mixed up with confounders and the replication crisis, one potential confounder being the birth order effect or stoppage effect (Gideon Koren, High-Dose Gestational Folic Acid and the Risk for Autism? The Birth Order Effect). As one person noted, “If the literature is correct, and folic acid really causes a 42% reduction in autism, we should see a sharp decrease in autism diagnosis for births starting in 1997. Instead, autism rates continued to increase at exactly the same rate they had before. There is nothing in the data to suggest even a small drop in autism around the time of folic acid fortification” (Chris Said, Autism, folic acid, and the trend without a blip). And elsewhere it’s recently been stated that, “The overall evidence for all these claims remains inconclusive. While some meta-analyses have found a convincing pattern, a comprehensive 2021 Nutrients review failed to find a ‘robust’ statistical association — a more definitive outcome in the field of epidemiology” (Molly Glick, A Popular Supplement’s Confusing Links With Autism Development). That same assessment is repeated by others: “Studies have pointed out a potential beneficial effect of prenatal folic acid maternal supplementation (600 µg) on the risk of autism spectrum disorder onset, but opposite results have been reported as well” (Bianka Hoxha et al, Folic Acid and Autism: A Systematic Review of the Current State of Knowledge). It doesn’t add up, but we won’t attempt to solve that mystery.
To further muck up the works, it’s amusing that some suggest a distinction be made: “The signs and symptoms of pediatric B12 deficiency frequently mimic those of autism spectrum disorders. Both autistic and brain-injured B12-deficient children have obsessive-compulsive behaviors and difficulty with speech, language, writing, and comprehension. B12 deficiency can also cause aloofness and withdrawal. Sadly, very few children presenting with autistic symptoms receive adequate testing for B12 deficiency” (Sally M. Pacholok, Pediatric Vitamin B12 Deficiency: When Autism Isn’t Autism). Not being alone in that claim, someone else said, “A vitamin B12 deficiency can cause symptoms and behaviours that sometimes get wrongly diagnosed as autism” (). That second person’s motivation was to deny the culpability of veganism: “Vegans and vegetarians often struggle to get sufficient levels of B12 in their diets. Therefore the children of pregnant vegans may be more likely to have B12 deficiency.” But also that, “Early research shows that many genuinely autistic people have excessive levels of B12 in their systems. […] Vegans are more likely to take supplements to boost the vitamins they lack in their diet, including B12.” A deficiency in early life and a compensatory excess in later life could both be tied into vegan malnourishment, maybe or maybe not. Apparently, however explained or else rationalized away, just because something looks like a duck, walks like a duck, and quacks like a duck doesn’t necessarily mean it’s actually a duck. But has the autistic label ever been anything other than a constellation of factors, symptoms, behaviors, and traits? It’s like asking whether ‘depression’ variously caused by stress, overwork, sleep deprivation, trauma, nutritional deficiency, toxicity, parasitism, or physical disease really is all the same mental illness. Admittedly, that is a useful line of thinking from the perspective of functional medicine, which looks for underlying causes and not mere diagnoses for the sake of insurance companies, bureaucratic paperwork, and pharmaceutical prescriptions.
Anyway, let’s just drop a load of links for anyone interested in exploring it for themselves:
However, it’s unclear whether this gender bias is the result of genetics or reflects differences in diagnosis or the way females manifest symptoms of the disorder. Girls with autism tend to actively compensate for their symptoms in ways that boys don’t, which may account for the discrepancy, says Skuse.
As a result, the females enrolled in studies may tend to be severely affected and carry multiple mutations. “There is some suggestion that higher-functioning females are out there in the general population, but they’re not being referred,” he says.
Here is what one could argue: most likely, the bias is not just in diagnosis, for there would be a directly related bias in the research itself. After all, it is diagnosis that determines the subjects in autism studies. So, if diagnosis is biased, there is no reason to assume that the subjects are representative of the full autistic population. Biased input would inevitably lead to biased results and hence biased conclusions. Basically, these studies at present might not be able to tell us anything about possible gender differences.
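To make that selection-bias logic concrete, here is a toy simulation (every number is invented purely for illustration): both groups are generated with an identical underlying trait distribution, yet a stricter diagnostic threshold for girls reproduces exactly the pattern described above, with fewer and more severely affected females in the diagnosed pool.

```python
# Toy simulation of diagnostic selection bias; all numbers here are made up.
import random

random.seed(42)

# Identical underlying trait distribution for both groups.
population = [(group, random.gauss(0, 1))
              for group in ("M", "F") for _ in range(100_000)]

def diagnosed(group: str, severity: float) -> bool:
    # Hypothetical bias: girls need higher severity to get referred/diagnosed.
    threshold = 1.0 if group == "M" else 1.8
    return severity > threshold

sample = [(g, s) for g, s in population if diagnosed(g, s)]
for g in ("M", "F"):
    scores = [s for grp, s in sample if grp == g]
    print(g, len(scores), f"mean severity {sum(scores) / len(scores):.2f}")
```

Though the two groups were generated identically, the diagnosed sample contains far fewer females, and those it does contain score as more severely affected; any study recruiting from that sample would wrongly conclude that the underlying populations differ.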
A reason given for the alleged failure to detect female autism is that “it may be because girls are better at masking the symptoms – better at copying social norms while not necessarily understanding them.” That might be true of many boys and men as well.
I have some Asperger’s-like traits, although I’ve never been diagnosed. Maybe it’s because I learned to fit in. I was socially clueless when younger and social situations stress me out, a set of factors exacerbated by my inner-focused nature. I don’t connect easily with others. But you wouldn’t notice that from casually interacting with me. I know how to pretend to be normal. It’s maybe why therapy has never worked for me, as I’ve developed a habit of effectively hiding my problems. It’s a survival mechanism that I learned young.
What occurs to me is that I’m a Jungian Feeling type. Myers-Briggs testing has found that most Feeling types are female, although about 30% are male. The same pattern in the opposite direction is seen with Thinking types. There is a general pattern that follows along gender lines. Still, that approximate third of the population is a significant number. It might mean that a third of male autistics don’t fit the male pattern, while maybe a third of female autistics do.
So the seeming gender difference found in autism could be more about personality differences. And those personality differences may or may not be genetic in nature. Much of this could instead be culturally learned behavior. It wouldn’t only be a matter of cultural biases in the diagnosis of autism; there would also be cultural biases in how autism is expressed. In that case, the question is what might be the relationship between culture, personality, gender, and neurocognitive development. There are obviously many complex factors involved, such as how a significant number of people don’t fall into simple gender categories: “It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA.” Since gender isn’t binary, the expressions of autism presumably also wouldn’t be binary.
It would be easy to test my speculation if formulated as a hypothesis. My prediction would be that Thinking-type females would be more likely to be diagnosed as autistic. And the opposite prediction would be that Feeling-type males would be less likely. That is simply to say that autism would express differently depending on personality traits/functions. Similar research could be done with the FFM/Big Five, and maybe such research has already been done. A related issue that would need to be disentangled is whether autism is more common among certain personalities or simply more diagnosed among certain personalities, an issue that could be studied either in relation to or separate from gender.
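For what it’s worth, that prediction could be checked with a simple test of independence on diagnosis counts by personality type. Here is a minimal sketch; the method is standard, but every count in the table is hypothetical.

```python
# 2x2 chi-square test of independence (stdlib only); counts are hypothetical.
# Rows: personality type among females; columns: (diagnosed, not diagnosed).
observed = {
    "Thinking-type females": (30, 970),
    "Feeling-type females":  (10, 990),
}

rows = list(observed.values())
row_totals = [sum(r) for r in rows]
col_totals = [sum(c) for c in zip(*rows)]
grand_total = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(rows):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

# With 1 degree of freedom, chi2 > 3.84 is significant at p = 0.05.
print(f"chi-square = {chi2:.2f}")
```

With these made-up counts the statistic comes out around 10, which would count as significant; with real data, the mirror-image table could be built for Feeling-type males to test the opposite prediction.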
All of this is particularly complicated for certain Myers-Briggs types. My specific type is INFP. This type is one of the most talented types when it comes to masking behavior, “known as being inscrutable.” As Carl Jung described dominant introverted feeling (what Myers-Briggs divides into two types: INFP and ISFP):
They are mostly silent, inaccessible, hard to understand; often they hide behind a childish or banal mask, and their temperament is inclined to melancholy…Their outward demeanor is harmonious, inconspicuous…with no desire to affect others, to impress, influence or change them in any way. If this is more pronounced, it arouses suspicion of indifference and coldness…Although there is a constant readiness for peaceful and harmonious co-existence, strangers are shown no touch of amiability, no gleam of responsive warmth…It might seem on a superficial view that they have no feelings at all. (Psych. Types, Para. 640-641)
An INFP aspie would make for a rather confusing specimen. It is the dominant introverted feeling that is so hard to discern. And this introverted feeling is hidden behind the chameleon-like and outward-facing extraverted intuition, in what is called the auxiliary position. Extraverted intuition is the ultimate mask to hide behind, as it is highly fluid and adaptable. And as the auxiliary function, extraverted intuition plays the role of mediation with, and defense against, the outside world.
Maybe a significant number of autistics have hidden introverted feeling. This would fit the autistic pattern of feeling strongly in response to others (high functioning affective empathy) while not easily connecting to others (low functioning cognitive empathy). By its nature, there is no straightforward way for introverted feeling to be expressed in social behavior. Yet an INFP can be talented at learning normal social behavior, as extraverted intuition helps them to be mimics. Or failing that, they could stonewall anyone trying to figure them out. Usually being conflict avoidant, most dominant introverted feeling types will go along to get along, as long as no one treads on their core sense of value.
I think it’s a bit silly to make a distinction between “male” and “female” interests in the first place. It can also be healthy for women to take an interest in more traditionally “male” subjects such as science and technology, and that doesn’t always mean they have a disorder. In making a diagnosis, one should always look for the underlying pattern rather than the particular interest, keeping in mind that interests differ for each individual; so (e.g.) whether a female is obsessively talking about computers or fashion should not matter, because the pattern is the same. Indeed, it probably is more obvious in the first case, especially when society is geared toward male/female stereotyping [so “masculine” interests for women stand out]. And besides, narrow interests are but one clue; it doesn’t hold for every individual with an ASD, as they may have a range of interests, just as typical people do.
Also, as some typologists argue, the US has been a society dominated by ESTJ types that is becoming dominated by ENTJ types (John Giannini, Compass of the Soul). The commonality then is E_TJ, that is to say dominant extraverted thinking. This typological bias is how American culture defines and enforces the social norms of the male gender. Unsurprisingly, that would also be how autism gets diagnosed: according to extraversion and thinking.
On the other hand, autism that was introverted and/or feeling-oriented would express in entirely different ways. In particular, dominant introverted feeling would express as strong affective empathy, rather than the (stereotypically) masculine tendency toward emotional detachment. Also, introversion taken on its own, whether in relation to feeling or thinking, would be more internalized and hence less observable, meaning obsessions unlikely to be seen in outward behavior: more subtle and nuanced, or else more hidden and masked.
This personality perspective might be immensely more helpful than using a gender lens alone. It’s also a more psychologically complex frame of interpretation, appealing to my personality predilections. Considering that autism and Asperger’s were originally observed and defined by men, one might wonder what kind of personalities those men had. Their personalities might have determined which personalities they were drawn to studying and hence drawn to using as the standard for their early models of the autism spectrum.
I was reading Ungifted: Intelligence Redefined by Scott Barry Kaufman. I came across a section about Aspergers. The more I’ve read about it over the years the more I suspect that I have some form of it.*
One theory of autism is that it involves a strong focus on details, which can lead to not seeing the forest for the trees; but if one is high-functioning enough, this can be compensated for. The Aspie takes in so many details that this can lead to distraction and cognitive overload. There are two primary ways of dealing with this. First, Aspies might limit their interactions and narrow their focus to create a more manageable space in which to think and to feel more comfortable. Second, Aspies often learn to chunk information.
The second method is what I learned as a child when I was living in Deerfield, Illinois (a wealthy Jewish suburb of Chicago; more on this below). I was having trouble with reading and I stuttered. I had a hard time saying what a word was or even recalling the names of my friends; but I could describe what I meant when I wasn’t stuttering, and the only reason I was stuttering was that I couldn’t recall the word.
I went to speech therapy, but even the therapist wasn’t sure what my precise problem was. This therapist and my mom, who also was a speech therapist, went to a talk given by Diane J. German from Northwestern University, who maybe was working on her PhD dissertation at the time (my mom thinks this was in 1982, since I was diagnosed in first grade when I was 6 years old). She is now a professor emeritus at National Louis University. At the time, German was working on a new test for word recall issues. Here is an article about her work:
“The look on these children’s faces captures the problem in the most compelling way,” says Diane German, the principal researcher, who specializes in disorders of word-finding and a special education professor at National-Louis University in Chicago, Illinois. “They really struggle when they have to read a simple word like ‘nest’ out loud. Some grimace, others look stuck. Some just blurt out an answer that’s almost always wrong. Yet when asked to point to the same word on a page, they almost always get it right. Clearly they’ve got a problem and need help, but it’s not that they lack reading skills.”
One child in the study, previously diagnosed with these “word-finding” difficulties, couldn’t say “cocoon” as he tried to read a story aloud. When he got to the word, he stumbled and added, “You know, it is that brown thing hanging in the tree.”
“Clearly, this child had managed to ‘read’ the word to himself and comprehend it, or he could never have come up with that kind of description,” explains psychologist Rochelle Newman, co-author of the study and a University of Maryland professor of hearing and speech sciences. “He just couldn’t retrieve the sound pattern of the word.”
They immediately recognized that German was talking about my issues. German was looking to do a study. So, my mom did some of the testing for German’s study, but my mom recalls German coming to our house and testing me herself. That is how I became one of the kids used as a subject in her study. And that was the beginning of how I, unlike so many other kids, escaped the trap of sub-par remedial education and a life of low expectations.
My mom and the therapist learned about this new field of word recall issues. Before that time, no one was discussing any of this and speech therapists weren’t being taught about it. It was serendipity that I was beginning school at the time and nearby where this new field was being developed. With this new knowledge, my mom worked with my therapist to help me with word recall (along with a learning disability therapist, Diane Redfield, who taught me to read).
One of the things that helped me the most was information chunking. My mom explained that this had to do with more than just grouping similar words. It has to do with looking at words from every angle in order to understand their different aspects. It is a shifting of perspective and a breaking down into component parts. This is what allows word groupings to be useful. Grouping words goes hand in hand with chunking information. The more kinds of groupings and chunkings, the greater the capacity to think and communicate clearly.
I had an example of this just last night. I was thinking of early 20th-century anarchists and trying to recall one specific person. From the word ‘anarchist’, I thought of ‘women’s clinic’. Then from that I connected to the last name Goldman. Once I had the last name, I could recall the first name and so had the full name: Emma Goldman. I couldn’t just pull the name out by itself. I had to go through a process to get to it.
That isn’t my only method. I also use something similar to chunking that is more on a feeling level. I get an overall sense of something, a person or an idea or whatever. Once I have that sense, I just have to switch into the right state of mind and slowly feel into it. Anything I’m familiar with has a feeling-sense associated with it. This form of recall isn’t always efficient, but it works when I can’t use a direct chain of connections. This feeling-sense is very useful in general, though, for it allows me to chunk info in larger ways and helps me in feeling out patterns by sensing resonances.
All of this fits into why I’ve come to suspect I have Aspergers or something very similar. The one thing that demonstrated I wasn’t low IQ as a child was my ability to see patterns. This is also a talent of many Aspies. It is because Aspies see things in chunks of details that they are able to more flexibly scan for patterns. It is precisely where various chunks crossover that a whole begins to form, but this is building from the bottom up.
I do this in my thinking and writing. When taken to its extreme, I call the results thought-webs. Connections form, connections build upon connections, and then a sense of meaning emerges from that. It is an organic process of synthesizing rather than analyzing, although analyzing may follow as a secondary process. It is looking to the data to speak for itself, finding the harmony between the seemingly disparate.
It has its strengths and weaknesses. Its greatest strength is for research. My Asperger-like extraverted intuition (MBTI Ne) goes off in a million directions finding all the details until my brain is overloaded. Then begins the filtering and consolidating of it all into a unique synthesis, but that last part can be a doozy. I sometimes never get past the brain overload.
* More recently, I’ve learned of specific language impairment. It can have behavioral symptoms similar to autism, but it’s a different condition and much more common. It’s another possibility in describing my own difficulties, as much of it fits my experience.
—
As a side note, there is a reason I mentioned above that Deerfield is a wealthy Jewish suburb of Chicago. Here is an interesting detail of Deerfield’s history (from Wikipedia):
“In 1959, when Deerfield officials learned that a developer building a neighborhood of large new homes planned to make houses available to African Americans, they issued a stop-work order. An intense debate began about racial integration, property values, and the good faith of community officials and builders. For a brief time, Deerfield was spotlighted in the national news as ‘the Little Rock of the North.’ Supporters of integration were denounced and ostracized by angry residents. Eventually, the village passed a referendum to build parks on the property, thus putting an end to the housing development. Two model homes already partially completed were sold to village officials. The remaining land lay dormant for years before it was developed into what is now Mitchell Pool and Park and Jaycee Park. At the time, Deerfield’s black population was 12 people out of a total population of 11,786. This episode in Deerfield’s history is described in But Not Next Door by Harry and David Rosen, both residents of Deerfield. Since the early 1980s, however, Deerfield has seen a large influx of Jews and, more recently, Asians and Greeks, giving the community a more diverse ethnic makeup.”
I guess it was a wealthy Jewish suburb of Chicago that has become a wealthy Jewish, Asian and Greek suburb of Chicago.
I can tell you one thing for certain. Few poor kids, especially poor minorities, are privileged in the way I was by my early education opportunities. I went to a public school in Deerfield, but that is way different than going to a public school in the inner city of Chicago. If I had been a poor black kid in a poor black neighborhood, I would have been designated low IQ and that would have been the end of it.
How many poor black kids failing in school are as intelligent as I am? The evidence points to the answer being many.
It is one thing to experience something like a learning disability or Aspergers. It is a whole other matter to deal with a learning disability or Aspergers while dealing with poverty and prejudice.
Even ignoring racism, classism by itself is a powerful form of prejudice. My mom was raised working class and she raised us with a working-class sensibility. This meant she dressed us working class. My older brother was ridiculed for it in the Deerfield public school. It scarred him for life and contributed to his hatred of school ever after. Part of that had to do with our having previously lived in Bellefontaine, Ohio, a factory town at the edge of Appalachia. Apparently, we had picked up a bit of Appalachian speech, in that the rich kids in Deerfield ridiculed my brother Clay for saying ‘zeero’ when he meant ‘zero’.
It was a clear giveaway to our class background. So, even though we were technically upper middle class because my dad was a factory manager, we were new money upper middle class and the other kids knew it. I was, at that time, fortunate enough to have been too young to understand and maybe, because of my Aspergers, too socially oblivious to care.
If such minor forms of prejudice could have such powerful impact on my brother, imagine what more severe (and systemic) forms of prejudice will do to a child. To this day, my brother remains traumatized from his childhood experience of class prejudice and, sadly, has internalized it in ridiculing his ‘white trash’ neighbors in the small working class town he now lives in. Racism and classism, they are shitty mentalities that cause much damage, but unless you’ve been on the receiving end of prejudice it is hard to understand and appreciate.
* * *
Below is part of the section from Ungifted where Aspergers is discussed.
pp. 223-226:
An alternative perspective, which has gained a lot of research support in recent years, is that autism is merely a different way of processing incoming information. 23 Individuals with ASD have a greater attention to detail and tend to adopt a bottom-up strategy—they first perceive the parts of an object and then build up to the whole. 24 As Uta Frith puts it, people with autism have difficulty “seeing the forest for the trees.” There is neurological evidence that the unique mind of the person with ASD is due in part to an excessive number of short-distance, disorganized local connections in the prefrontal cortex (required for attention to detail) along with a reduced number of long-range or global connections necessary for integrating information from widespread and diverse brain regions. 25 As a result, people with high-functioning autism tend to have difficulty switching attention from the local to the global level. 26
This sometimes plays itself out in social communications. People with ASD focus on details in the environment most people find “irrelevant,” which can lead to some awkward social encounters. When people with ASD are shown photographs with social information (such as friends chatting) or movie clips from soap operas, their attention is focused much less on the people’s faces and eyes than the background scenery, such as light switches. 27 Differences among toddlers in attention to social speech is a robust predictor of ASD, and social attention differences in preschool lead to a deficit in theory of mind. 28 This is important, considering that an early lack of attention to social information can deprive the developing child of the social inputs and learning opportunities they require to develop expertise in social cognition. 29 It’s likely that from multiple unrewarding social interactions during the course of development, people with ASD learn that social interactions are unrewarding, and retreat even further into themselves.
Kate O’Connor and Ian Kirk argue that the atypical social behaviors found in people with ASD are more likely the result of a processing difference than a social deficit, and may represent a strategy to filter out too much sensory information. 30 Indeed, people with ASD often report emotional confusion during social interactions, in which they interpret expressions, gestures, and body language to mean something different from or even the opposite of what the other person intended. 31 Many people with ASD report that the eye region is particularly “confusing” and “frightening.” 32
Indeed, the eye region is very complex, transmitting a lot of information in a brief time span. For one thing, it’s always in motion (blinking, squinting, saccadic movement, and so on). But the eye region also can depict a wide range of emotions in rapid succession. It’s likely that over the course of many overwhelming interactions with people in the context of other sensory information coming in from the environment, people with ASD learn to look less at the eye region of faces. 33 People with ASD do frequently report being distracted by sensory information in the environment, including background noise, fluorescent light, shiny objects, body movement, and smells. 34
[ . . . ]
One robust finding is that people with ASD have enhanced perceptual functioning. 40 People with ASD tend to perform better than people without ASD symptoms on IQ subtests that involve nonverbal fluid reasoning and the segmentation and reconstruction of novel visual designs. 41 Individuals with ASD also perform better than controls on the Embedded Figures Task (EFT), which requires quick detection of a target within a complex pattern. 42 The ASD tendency to see patterns as collections of details instead of as wholes helps people with ASD to segment and chunk visual information, freeing up visual working memory resources and allowing them to handle a higher perceptual load than typical adults. 43