Autism and the Upper Crust

There are multiple folktales about the tender senses of royalty, aristocrats, and other elites. The best-known example is “The Princess and the Pea”. In the Aarne-Thompson-Uther system of folktale categorization, it is listed as type 704, the search for a sensitive wife, though not all narrative variants of elite sensitivity involve potential wives. The man who made this particular story famous is Hans Christian Andersen, who published his version in 1835. He longed to be a part of the respectable class but felt excluded. Some speculate that he projected his own class issues onto his slightly altered version of the folktale, something discussed in the Wikipedia article about the story:

“Wullschlager observes that in “The Princess and the Pea” Andersen blended his childhood memories of a primitive world of violence, death and inexorable fate, with his social climber’s private romance about the serene, secure and cultivated Danish bourgeoisie, which did not quite accept him as one of their own. Researcher Jack Zipes said that Andersen, during his lifetime, “was obliged to act as a dominated subject within the dominant social circles despite his fame and recognition as a writer”; Andersen therefore developed a feared and loved view of the aristocracy. Others have said that Andersen constantly felt as though he did not belong, and longed to be a part of the upper class.[11] The nervousness and humiliations Andersen suffered in the presence of the bourgeoisie were mythologized by the storyteller in the tale of “The Princess and the Pea”, with Andersen himself the morbidly sensitive princess who can feel a pea through 20 mattresses.[12]

Maria Tatar notes that, unlike the folk heroine of his source material for the story, Andersen’s princess has no need to resort to deceit to establish her identity; her sensitivity is enough to validate her nobility. For Andersen, she indicates, “true” nobility derived not from an individual’s birth but from their sensitivity. Andersen’s insistence upon sensitivity as the exclusive privilege of nobility challenges modern notions about character and social worth. The princess’s sensitivity, however, may be a metaphor for her depth of feeling and compassion.[1] […] Researcher Jack Zipes notes that the tale is told tongue-in-cheek, with Andersen poking fun at the “curious and ridiculous” measures taken by the nobility to establish the value of bloodlines. He also notes that the author makes a case for sensitivity being the decisive factor in determining royal authenticity and that Andersen “never tired of glorifying the sensitive nature of an elite class of people”.[15]”

Even if that is true, there is more going on here than some guy working out his personal issues through fiction. This princess’s sensory sensitivity sounds like autism spectrum disorder, and I have a theory about that. Autism has been associated with certain foods like wheat, specifically refined flour in highly processed foods (The Agricultural Mind). And a high-carb diet in general causes numerous neurocognitive problems (Ketogenic Diet and Neurocognitive Health), along with other health conditions such as metabolic syndrome (Dietary Dogma: Tested and Failed), insulin resistance (Coping Mechanisms of Health), atherosclerosis (Ancient Atherosclerosis?), and scurvy (Sailors’ Rations, a High-Carb Diet) — by the way, the rates of these diseases have been increasing over the generations, often first appearing among the affluent. Sure, grains have long been part of the diet, but the one grain most associated with the wealthy, going back millennia, was wheat: it was harder to grow, which kept it in short supply and made it expensive. Indeed, it is wheat, not the other grains, that gets brought up in relation to autism. This is largely because of gluten, though other components have been pointed to as well.

It is relevant that these stories were written down around the time when the first large grain surpluses were becoming common and bread, white bread most of all, was becoming a greater part of the diet. This dietary shift was first seen among the upper classes. It’s too bad we don’t have cross-generational data on autism rates broken down by demographics and diet. But it is interesting to note that neurasthenia, a 19th-century mental health condition that also involved sensitivity, was seen as a disease of the middle-to-upper class (The Crisis of Identity), and that this notion of the elite as sensitive was a romanticized ideal going back to the 1700s with what Jane Austen referred to as ‘sensibility’ (see Bryan Kozlowski’s The Jane Austen Diet, as quoted in the link immediately above). In that same historical period, others noted that schizophrenia was spreading along with civilization (e.g., Samuel Gridley Howe and Henry Maudsley; see The Invisible Plague by E. Fuller Torrey & Judy Miller). I’d add that there appear to be some overlapping factors between schizophrenia and autism — besides gluten, some of the implicated factors are glutamate, exorphins, inflammation, etc. “It is unlikely,” writes William Davis, “that wheat exposure was the initial cause of autism or ADHD but, as with schizophrenia, wheat appears to be associated with worsening characteristics of the conditions” (Wheat Belly, p. 48).

For most of human history, crop failures and famine were a regular occurrence. And this most harshly affected the poor masses when grain and bread prices went up, leading to food riots and sometimes revolutions (e.g., French Revolution). Before the 1800s, grains were so expensive that, in order to make them affordable, breads were often adulterated with fillers or entirely replaced with grain substitutes, the latter referred to as “famine breads” and sometimes made with tree bark. Even when available, the average person might be spending most of their money on bread, as it was one of the most costly foods around and other foods weren’t always easily obtained.

Even so, grain being highly sought after certainly doesn’t imply that the average person was eating a high-carb diet, quite the opposite (A Common Diet). Food in general was expensive and scarce and, among grains, wheat was the least common. At times, this would have forced feudal peasants and later landless peasants onto a diet limited in both carbohydrates and calories, which would have meant a typically ketogenic state (Fasting, Calorie Restriction, and Ketosis), albeit far from an optimal way of achieving it. The further back in time one looks, the more prevalent ketosis would have been (e.g., the Spartan and Mongol diets), maybe with the exception of the ancient Egyptians (Ancient Atherosclerosis?). In places like Ireland and Russia, the lower classes remained on this poverty diet, often a starvation diet, well into the mid-to-late 1800s; in the case of the Irish, though, it was an artificially constructed famine, as the island’s other food crops were exported to the international market by the English even as the potato crop failed.

Yet, in America, the poor were fortunate in being able to rely on a meat-based diet, because wild game was widely available and easily obtained, even in cities. That may have been true for many European populations as well during earlier feudalism, specifically before peasants were restricted from hunting and trapping on the commons; the improvement in health after the fall of the Roman Empire suggests as much (Malnourished Americans). During this earlier period, only the wealthy could afford high-quality bread and large amounts of grain-based foods in general. That meant highly refined, fluffy white bread that couldn’t easily be adulterated. Likewise, for the early centuries of colonialism, sugar was only available to the wealthy — in fact, it was a controlled substance typically found only in pharmacies. But for the elite who had access, sugary pastries and other starchy dessert foods became popular. White bread and pastries were status symbols. Sugar was so scarce that wealthy households kept it locked away so the servants couldn’t steal it. Even fruit was disproportionately eaten by the wealthy. A fruit pie, combining all three of the above ingredients in a single delicacy, would truly have been a luxury.

Part of the context is that, although grain yields had been increasing during the early colonial era, dependable surplus yields of grain didn’t arrive before the 1800s. Until then, white bread, pastries, and the like simply were not affordable for most people. Consumption of grains, along with other starchy carbs and sugar, rose with 19th-century advancements in agriculture. Simultaneously, incomes were rising and the middle class was growing. But even as yields increased, most of the surplus grain went to feeding livestock, not to feeding the poor; grain was perceived as cattle feed. Protein consumption increased more than carbohydrate consumption did, at least initially. The American population, in particular, didn’t develop a high-carb diet until much later, as mass urbanization in the US also happened later.

By the end of the 19th century, a mass diet of starchy and sugary foods had emerged, built especially on the spread of wheat farming and white bread. In the US, only in the 20th century did grain consumption finally surpass meat consumption. Following that, rates of autism have grown. Along with sensory sensitivity, autistics are well known for pickiness about food and for cravings for particular foods, such as those made from highly refined wheat flour, from white bread to crackers. Yet the folktales in question were speaking to a still-living memory of an earlier time, before these changes had happened. Hans Christian Andersen first published “The Princess and the Pea” in 1835, but such stories had been told orally long before that, probably going back at least centuries; we now know that some folktales have their origins millennia earlier, even in the Bronze Age. According to the Wikipedia article on “The Princess and the Pea”,

“The theme of this fairy tale is a repeat of that of the medieval Perso-Arabic legend of al-Nadirah.[6] […] Tales of extreme sensitivity are infrequent in world culture but a few have been recorded. As early as the 1st century, Seneca the Younger had mentioned a legend about a Sybaris native who slept on a bed of roses and suffered due to one petal folding over.[23] The 11th-century Kathasaritsagara by Somadeva tells of a young man who claims to be especially fastidious about beds. After sleeping in a bed on top of seven mattresses and newly made with clean sheets, the young man rises in great pain. A crooked red mark is discovered on his body and upon investigation a hair is found on the bottom-most mattress of the bed.[5] An Italian tale called “The Most Sensitive Woman” tells of a woman whose foot is bandaged after a jasmine petal falls upon it.”

I would take it as telling that this particular folktale doesn’t appear to be as ancient as the other examples. That would support my argument that the sensory sensitivity of autism might be caused by greater consumption of refined wheat, something that only began to appear late in the Axial Age and only became common much later. Even the few wealthy who did have access in ancient times were eating rather limited amounts of white bread. It might have required hitting a certain level of intake, not seen until modernity or close to it, before extreme autistic symptoms became noticeable among a larger number of the aristocracy and monarchy.

* * *

Sources

Others have connected such folktales of sensitivity with autism:

The high cost and elite status of grains, especially white bread, prior to 19th century high yields:

The Life of a Whole Grain Junkie
by Seema Chandra

Did you know where the term refined comes from? Around 1826, whole grain bread used by the military was called superior for health versus the white refined bread used by the aristocracy. Before the industrial revolution, it was more labor consuming and more expensive to refine bread, so white bread was the main staple loaf for aristocracy. That’s why it was called “refined”.

The War on White Bread
by Livia Gershon

Bread has always been political. For Romans, it helped define class; white bread was for aristocrats, while the darkest brown loaves were for the poor. Later, Jacobin radicals claimed white bread for the masses, while bread riots have been a perennial theme of populist uprisings. But the political meaning of the staff of life changed dramatically in the early twentieth-century United States, as Aaron Bobrow-Strain, who went on to write the book White Bread, explained in a 2007 paper. […]

Even before this industrialization of baking, white flour had had its critics, like cracker inventor Sylvester Graham. Now, dietary experts warned that white bread was, in the words of one doctor, “so clean a meal worm can’t live on it for want of nourishment.” Or, as doctor and radio host P.L. Clark told his audience, “the whiter your bread, the sooner you’re dead.”

Nutrition and Economic Development in the Eighteenth-Century Habsburg Monarchy: An Anthropometric History
by John Komlos
p. 31

Furthermore, one should not disregard the cultural context of food consumption. Habits may develop that prevent the attainment of a level of nutritional status commensurate with actual real income. For instance, the consumption of white bread or of polished rice, instead of whole-wheat bread or unpolished rice, might increase with income, but might detract from the body’s well-being. Insofar as cultural habits change gradually over time, significant lags could develop between income and nutritional status.

pp. 192-194

As a consequence, per capita food consumption could have increased between 1660 and 1740 by as much as 50 percent. The fact that real wages were higher in the 1730s than at any time since 1537 indicates a high standard of living was reached. The increase in grain exports, from 2.8 million quintals in the first decade of the eighteenth century to 6 million by the 1740s, is also indicative of the availability of nutrients.

The remarkably good harvests were brought about by the favorable weather conditions of the 1730s. In England the first four decades of the eighteenth century were much warmer than the last decades of the previous century (Table 5.1). Even small differences in temperature may have important consequences for production. […] As a consequence of high yields the price of consumables declined by 14 percent in the 1730s relative to the 1720s. Wheat cost 30 percent less in the 1730s than it did in the 1660s. […] The increase in wheat consumption was particularly important because wheat was less susceptible to mold than rye. […]

There is direct evidence that the nutritional status of many populations was, indeed, improving in the early part of the eighteenth century, because human stature was generally increasing in Europe as well as in America (see Chapter 2). This is a strong indication that protein and caloric intake rose. In the British colonies of North America, an increase in food consumption—most importantly, of animal protein—in the beginning of the eighteenth century has been directly documented. Institutional menus also indicate that diets improved in terms of caloric content.

Changes in British income distribution conform to the above pattern. Low food prices meant that the bottom 40 percent of the distribution was gaining between 1688 and 1759, but by 1800 had declined again to the level of 1688. This trend is another indication that a substantial portion of the population that was at a nutritional disadvantage was doing better during the first half of the eighteenth century than it did earlier, but that the gains were not maintained throughout the century.

The Roots of Rural Capitalism: Western Massachusetts, 1780-1860
By Christopher Clark
p. 77

Livestock also served another role, as a kind of “regulator,” balancing the economy’s need for sufficiency and the problems of producing too much. In good years, when grain and hay were plentiful, surpluses could be directed to fattening cattle and hogs for slaughter, or for exports to Boston and other markets on the hoof. Butter and cheese production would also rise, for sale as well as for family consumption. In poorer crop years, however, with feedstuffs rarer, cattle and swine could be slaughtered in greater numbers for household and local consumption, or for export as dried meat.

p. 82

Increased crop and livestock production were linked. As grain supplies began to overtake local population increases, more corn in particular became available for animal feed. Together with hay, this provided sufficient feedstuffs for farmers in the older Valley towns to undertake winter cattle fattening on a regular basis, without such concern as they had once had for fluctuations in output near the margins of subsistence. Winter fattening for market became an established practice on more farms.

When Food Changed History: The French Revolution
by Lisa Bramen

But food played an even larger role in the French Revolution just a few years later. According to Cuisine and Culture: A History of Food and People, by Linda Civitello, two of the most essential elements of French cuisine, bread and salt, were at the heart of the conflict; bread, in particular, was tied up with the national identity. “Bread was considered a public service necessary to keep the people from rioting,” Civitello writes. “Bakers, therefore, were public servants, so the police controlled all aspects of bread production.”

If bread seems a trifling reason to riot, consider that it was far more than something to sop up bouillabaisse for nearly everyone but the aristocracy—it was the main component of the working Frenchman’s diet. According to Sylvia Neely’s A Concise History of the French Revolution, the average 18th-century worker spent half his daily wage on bread. But when the grain crops failed two years in a row, in 1788 and 1789, the price of bread shot up to 88 percent of his wages. Many blamed the ruling class for the resulting famine and economic upheaval.

What Brought on the French Revolution?
by H.A. Scott Trask

Through 1788 and into 1789 the gods seemed to be conspiring to bring on a popular revolution. A spring drought was followed by a devastating hail storm in July. Crops were ruined. There followed one of the coldest winters in French history. Grain prices skyrocketed. Even in the best of times, an artisan or factor might spend 40 percent of his income on bread. By the end of the year, 80 percent was not unusual. “It was the connection of anger with hunger that made the Revolution possible,” observed Schama. It was also envy that drove the Revolution to its violent excesses and destructive reform.

Take the Reveillon riots of April 1789. Reveillon was a successful Parisian wall-paper manufacturer. He was not a noble but a self-made man who had begun as an apprentice paper worker but now owned a factory that employed 400 well-paid operatives. He exported his finished products to England (no mean feat). The key to his success was technical innovation, machinery, the concentration of labor, and the integration of industrial processes, but for all these the artisans of his district saw him as a threat to their jobs. When he spoke out in favor of the deregulation of bread distribution at an electoral meeting, an angry crowd marched on his factory, wrecked it, and ransacked his home.

Why did our ancestors prefer white bread to wholegrains?
by Rachel Laudan

Only in the late nineteenth and twentieth century did large numbers of “our ancestors”–and obviously this depends on which part of the world they lived in–begin eating white bread. […]

Wheat bread was for the few. Wheat did not yield well (only seven or eight grains for one planted compared to corn that yielded dozens) and is fairly tricky to grow.

White puffy wheat bread was for even fewer. Whiteness was achieved by sieving out the skin of the grain (bran) and the germ (the bit that feeds the new plant). In a world of scarcity, this made wheat bread pricey. And puffy, well, that takes fairly skilled baking plus either yeast from beer or the kind of climate that sourdough does well in. […]

Between 1850 and 1950, the price of wheat bread, even white wheat bread, plummeted in price as a result of the opening up of new farms in the US and Canada, Argentina, Australia and other places, the mechanization of plowing and harvesting, the introduction of huge new flour mills, and the development of continuous flow bakeries.

In 1800 only half the British population could afford wheat bread. In 1900 everybody could.

History of bread – Industrial age
The Industrial Age (1700 – 1887)
from The Federation of Bakers

In Georgian times the introduction of sieves made of Chinese silk helped to produce finer, whiter flour and white bread gradually became more widespread. […]

1757
A report accused bakers of adulterating bread by using alum lime, chalk and powdered bones to keep it very white. Parliament banned alum and all other additives in bread but some bakers ignored the ban. […]

1815
The Corn Laws were passed to protect British wheat growers. The duty on imported wheat was raised and price controls on bread lifted. Bread prices rose sharply. […]

1826
Wholemeal bread, eaten by the military, was recommended as being healthier than the white bread eaten by the aristocracy.

1834
Rollermills were invented in Switzerland. Whereas stonegrinding crushed the grain, distributing the vitamins and nutrients evenly, the rollermill broke open the wheat berry and allowed easy separation of the wheat germ and bran. This process greatly eased the production of white flour but it was not until the 1870s that it became economic. Steel rollermills gradually replaced the old windmills and watermills.

1846
With large groups of the population near to starvation the Corn Laws were repealed and the duty on imported grain was removed. Importing good quality North American wheat enabled white bread to be made at a reasonable cost. Together with the introduction of the rollermill this led to the increase in the general consumption of white bread – for so long the privilege of the upper classes.

Of all foods bread is the most noble: Carl von Linné (Carl Linnaeus) on bread
by Leena Räsänen

In many contexts Linné explained how people with different standing in society eat different types of bread. He wrote, “Wheat bread, the most excellent of all, is used only by high-class people”, whereas “barley bread is used by our peasants” and “oat bread is common among the poor”. He made a remark that “the upper classes use milk instead of water in the dough, as they wish to have a whiter and better bread, which thereby acquires a more pleasant taste”. He compared his own knowledge on the food habits of Swedish society with those mentioned in classical literature. Thus, according to Linné, Juvenal wrote that “a soft and snow-white bread of the finest wheat is given to the master”, while Galen condemned oat bread as suitable only for cattle, not for humans. Here Linné had to admit that it is, however, consumed in certain provinces in Sweden.

Linné was aware of and discussed the consequences of consuming less tasty and less satisfying bread, but he seems to have accepted as a fact that people belonging to different social classes should use different foods to satisfy their hunger. For example, he commented that “bran is more difficult to digest than flour, except for hard-labouring peasants and the likes, who are scarcely troubled by it”. The necessity of having to eat filling but less palatable bread was inevitable, but could be even positive from the nutritional point of view. “In Östergötland they mix the grain with flour made from peas and in Scania with vetch, so that the bread may be more nutritious for the hard-working peasants, but at the same time it becomes less flavoursome, drier and less pleasing to the palate.” And, “Soft bread is used mainly by the aristocracy and the rich, but it weakens the gums and teeth, which get too little exercise in chewing. However, the peasant folk who eat hard bread cakes generally have stronger teeth and firmer gums”.

It is intriguing that Linné did not find it necessary to discuss the consumption or effect on health of other bakery products, such as the sweet cakes, tarts, pies and biscuits served by the fashion-conscious upper class and the most prosperous bourgeois. Several cookery books with recipes for the fashionable pastry products were published in Sweden in the eighteenth century 14. The most famous of these, Hjelpreda i Hushållningen för Unga Fruentimmer by Kajsa Warg, published in 1755, included many recipes for sweet pastries 15. Linné mentioned only in passing that the addition of egg makes the bread moist and crumbly, and sugar and currants impart a good flavour.

The sweet and decorated pastries were usually consumed with wine or with the new exotic beverages, tea and coffee. It is probable that Linné regarded pastries as unnecessary luxuries, since expensive imported ingredients, sugar and spices, were indispensable in their preparation. […]

Linné emphasized that soft and fresh bread does not draw in as much saliva and thus remains undigested for a long time, “like a stone in the stomach”. He strongly warned against eating warm bread with butter. While it was “considered as a delicacy, there was scarcely another food that was more damaging for the stomach and teeth, for they were loosen’d by it and fell out”. By way of illustration he told an example reported by a doctor who lived in a town near Amsterdam. Most of the inhabitants of this town were bakers, who sold bread daily to the residents of Amsterdam and had the practice of attracting customers with oven-warm bread, sliced and spread with butter. According to Linné, this particular doctor was not surprised when most of the residents of this town “suffered from bad stomach, poor digestion, flatulence, hysterical afflictions and 600 other problems”. […]

Linné was not the first in Sweden to write about famine bread. Among his remaining papers in London there are copies from two official documents from 1696 concerning the crop failure in the northern parts of Sweden and the possibility of preparing flour from different roots, and an anonymous small paper which contained descriptions of 21 plants, the roots or leaves of which could be used for flour 10. These texts had obviously been studied by Linné with interest.

When writing about substitute breads, Linné formulated his aim as the following: “It will teach the poor peasant to bake bread with little or no grain in the circumstance of crop failure without destroying the body and health with unnatural foods, as often happens in the countryside in years of hardship” 10.

Linné’s idea for a publication on bread substitutes probably originated during his early journeys to Lapland and Dalarna, where grain substitutes were a necessity even in good years. Actually, bark bread was eaten in northern Sweden until the late nineteenth century 4. In the poorest regions of eastern and north-eastern Finland it was still consumed in the 1920s 26. […]

Bark bread has been used in the subarctic area since prehistoric times 4. According to Linné, no other bread was such a common famine bread. He described how in springtime the soft inner layer can be removed from debarked pine trees, cleaned of any remaining bark, roasted or soaked to remove the resin, and dried and ground into flour. Linné had obviously eaten bark bread, since he could say that “it tastes rather well, is however more bitter than other bread”. His view of bark bread was most positive but perhaps unrealistic: “People not only sustain themselves on this, but also often become corpulent of it, indeed long for it.” Linné’s high regard for bark bread was shared by many of his contemporaries, but not all. For example, Pehr Adrian Gadd, the first professor of chemistry in Turku (Åbo) Academy and one of the most prominent utilitarians in Finland, condemned bark bread as “useless, if not harmful to use” 28. In Sweden, Anders Johan Retzius, a professor in Lund and an expert on the economic and pharmacological potential of Swedish flora, called bark bread “a paltry food, with which they can hardly survive and of which they always after some time get a swollen body, pale and bluish skin, big and hard stomach, constipation and finally dropsy, which ends the misery” 4. […]

Linné’s investigations of substitutes for grain became of practical service when a failed harvest of the previous summer was followed by famine in 1757 10. Linné sent a memorandum to King Adolf Fredrik in the spring of 1757 and pointed out the risk to the health of the hungry people when they ignorantly chose unsuitable plants as a substitute for grain. He included a short paper on the indigenous plants which in the shortage of grain could be used in bread-making and other cooking. His Majesty immediately permitted this leaflet to be printed at public expense and distributed throughout the country 10. Soon Linné’s recipes using wild flora were read out in churches across Sweden. In Berättelse om The inhemska wäxter, som i brist af Säd kunna anwändas til Bröd- och Matredning, Linné 32 described the habitats and the popular names of about 30 edible wild plants, eight of which were recommended for bread-making.

The Agricultural Mind

Let me make an argument about individualism, rigid egoic boundaries, and hence Jaynesian consciousness. But I’ll come at it from a less typical angle. I’ve been reading much about diet, nutrition, and health. With agriculture, the entire environment in which humans lived was fundamentally transformed: the rise of inequality and hierarchy, concentrated wealth and centralized power, not to mention the increase of parasites and diseases from urbanization and close cohabitation with farm animals (The World Around Us). We might be able to thank early agricultural societies for introducing malaria to the world. Maybe more importantly, there are significant links between what we eat and so much else: gut health, hormonal regulation, immune system, and neurocognitive functioning. There are multiple pathways, one of which is direct, connecting the gut and the brain. The gut is sometimes called the second brain, but in evolutionary terms it is the first brain. To demonstrate one example of a connection, many are beginning to refer to Alzheimer’s as type 3 diabetes, and dietary interventions have reversed symptoms in clinical studies. Also, microbes and parasites have been shown to influence our neurocognition and psychology, even altering personality traits and behavior (e.g., Toxoplasma gondii).

One possibility to consider is the role of exorphins, which are addictive and can be blocked in the same way as opioids. Exorphin, in fact, means external morphine-like substance, in the way that endorphin means indwelling morphine-like substance. Exorphins are found in milk and wheat. Milk, in particular, stands out. Even though exorphins are found in other foods, it’s been argued that they are insignificant because they theoretically can’t pass through the gut barrier, much less the blood-brain barrier. Yet exorphins have been measured elsewhere in the human body. One explanation is gut permeability, which can be caused by many factors such as stress but also by milk itself. The purpose of milk is to get nutrients into the calf, and this is done by widening the spaces in the gut lining to allow more nutrients through the protective barrier. Exorphins get in as well and create a pleasurable experience to motivate the calf to drink more. Along with exorphins, grains and dairy also contain dopaminergic peptides, and dopamine is the other major addictive substance. It feels good to consume dairy, as with wheat, whether you’re a calf or a human, and so one wants more.

Addiction, of food or drugs or anything else, is a powerful force. And it is complex in what it affects, not only physiologically and psychologically but also on a social level. Johann Hari offers a great analysis in Chasing the Scream. He makes the case that addiction is largely about isolation and that the addict is the ultimate individual (see To Put the Rat Back in the Rat Park, Rationalizing the Rat Race, Imagining the Rat Park, & Individualism and Isolation), and by the way this connects to Jaynesian consciousness with its rigid egoic boundaries, as opposed to the bundled and porous mind, the extended and enmeshed self of bicameralism and animism. It stands out to me that addiction and addictive substances have increased over the course of civilization, and I’ve argued that this is about a totalizing cultural system and a fully encompassing ideological worldview, what some call a reality tunnel (see discussion of addiction and social control in Diets and Systems & Western Individuality Before the Enlightenment Age). The growing of poppies, sugar, etc. came later on in civilization, as did the production of beer and wine — by the way, alcohol releases endorphins, sugar causes a serotonin high, and both activate the hedonic pathway. Also, grain and dairy were slow to catch on as a large part of the diet. Until recent centuries, most populations remained dependent on animal foods, including wild game (I discuss this era of dietary transition and societal transformation in numerous posts, with industrialization and technology pushing the already stressed agricultural mind to an extreme: Ancient Atherosclerosis?, To Be Fat And Have Bread, Autism and the Upper Crust, “Yes, tea banished the fairies.”, Voice and Perspective, Hubris of Nutritionism, Health From Generation To Generation, Dietary Health Across Generations, Moral Panic and Physical Degeneration, The Crisis of Identity, The Disease of Nostalgia, & Technological Fears and Media Panics).
Americans, for example, ate large amounts of meat, butter, and lard from the colonial era through the 19th century (see Nina Teicholz, The Big Fat Surprise; passage quoted in full at Malnourished Americans). In 1900, Americans on average were getting only about 10% of their diet from carbohydrates, and sugar intake was minimal.

Something else to consider is that low-carb diets can alter how the body and brain function. That is even more true if combined with intermittent fasting and the restricted eating times that would have been more common in the past. Taken together, earlier humans would have spent more time in ketosis (fat-burning mode, as opposed to glucose-burning), which dramatically affects human biology. The further one goes back in history, the more time people probably spent in ketosis. One difference with ketosis is that cravings and food addictions disappear. It’s a non-addictive or maybe even anti-addictive state of mind. (For more discussion of this topic, see previous posts: Fasting, Calorie Restriction, and Ketosis, Ketogenic Diet and Neurocognitive Health, Is Ketosis Normal?, & “Is keto safe for kids?”.) Many hunter-gatherer tribes can go days without eating and it doesn’t appear to bother them, such as in Daniel Everett’s account of the Piraha, and that is typical of ketosis. This was also observed of Mongol warriors, who could ride and fight for days on end without tiring or needing to stop for food. What is also different about hunter-gatherers and similar traditional societies is how communal they are or were and how much more expansive their identities are in belonging to a group. Anthropological research shows how hunter-gatherers often have a sense of personal space that extends into the environment around them. What if that isn’t merely cultural but has something to do with how their bodies and brains operate? Maybe diet even plays a role. Hold that thought for a moment.

Now go back to the two staples of the modern diet, grains and dairy. Besides exorphins and dopaminergic substances, they also have high levels of glutamate, as part of gluten and casein respectively. Dr. Katherine Reid is a biochemist whose daughter was diagnosed with severe autism. She went into research mode and experimented first with supplementation and then with diet. Many things seemed to help, but the greatest result came from restriction of glutamate, a difficult challenge as it is a common food additive (see her TED talk here and another talk here or, for a short and informal video, look here). This requires going on a largely whole-foods diet, that is to say eliminating processed foods. But when dealing with a serious issue, it is worth the effort. Dr. Reid’s daughter showed such immense improvement that she no longer qualified for her special needs school. After being on this diet for a while, she socialized and communicated normally like any other child, something she was previously incapable of. Keep in mind that glutamate is necessary as a foundational neurotransmitter in modulating communication between the gut and brain. But typically we only get small amounts of it, as opposed to the large doses found in the modern diet. In response to the TED Talk given by Reid, Georgia Ede commented that it’s, “Unclear if glutamate is main culprit, b/c a) little glutamate crosses blood-brain barrier; b) anything that triggers inflammation/oxidation (i.e. refined carbs) spikes brain glutamate production.” Either way, glutamate plays a powerful role in brain functioning. And no matter the exact line of causation, industrially processed foods in the modern diet would be involved.

Glutamate is also implicated in schizophrenia: “The most intriguing evidence came when the researchers gave germ-free mice fecal transplants from the schizophrenic patients. They found that “the mice behaved in a way that is reminiscent of the behavior of people with schizophrenia,” said Julio Licinio, who co-led the new work with Wong, his research partner and spouse. Mice given fecal transplants from healthy controls behaved normally. “The brains of the animals given microbes from patients with schizophrenia also showed changes in glutamate, a neurotransmitter that is thought to be dysregulated in schizophrenia,” he added. The discovery shows how altering the gut can influence an animal’s behavior” (Roni Dengler, Researchers Find Further Evidence That Schizophrenia is Connected to Our Guts; reporting on Peng Zheng et al., The gut microbiome from patients with schizophrenia modulates the glutamate-glutamine-GABA cycle and schizophrenia-relevant behaviors in mice, in the journal Science Advances). And glutamate is involved in other conditions as well, such as in relation to GABA: “But how do microbes in the gut affect [epileptic] seizures that occur in the brain? Researchers found that the microbe-mediated effects of the Ketogenic Diet decreased levels of enzymes required to produce the excitatory neurotransmitter glutamate. In turn, this increased the relative abundance of the inhibitory neurotransmitter GABA. Taken together, these results show that the microbe-mediated effects of the Ketogenic Diet have a direct effect on neural activity, further strengthening support for the emerging concept of the ‘gut-brain’ axis.” (Jason Bush, Important Ketogenic Diet Benefit is Dependent on the Gut Microbiome). Glutamate is one neurotransmitter among many that can be affected in a similar manner; e.g., serotonin is also produced in the gut.

That reminds me of propionate, a short-chain fatty acid. It is another substance normally taken in at a low level. Certain foods, including grains and dairy, contain it. The problem is that, as a useful preservative, it has been generously added to the food supply. Research on rodents shows that injecting them with propionate causes autistic-like behaviors. And other rodent studies show how this stunts learning ability and causes repetitive behavior (both related to the autistic demand for the familiar), as too much propionate entrenches mental patterns through the mechanism that gut microbes use to communicate to the brain how to return to a needed food source. A recent study shows that propionate alters not only brain functioning but brain development (L.S. Abdelli et al., Propionic Acid Induces Gliosis and Neuro-inflammation through Modulation of PTEN/AKT Pathway in Autism Spectrum Disorder). As reported by Suhtling Wong-Vienneau at the University of Central Florida, “when fetal-derived neural stem cells are exposed to high levels of Propionic Acid (PPA), an additive commonly found in processed foods, it decreases neuron development” (Processed Foods May Hold Key to Rise in Autism). This study “is the first to discover the molecular link between elevated levels of PPA, proliferation of glial cells, disturbed neural circuitry and autism.” The impact is profound and permanent — Pedersen offers the details:

“In the lab, the scientists discovered that exposing neural stem cells to excessive PPA damages brain cells in several ways: First, the acid disrupts the natural balance between brain cells by reducing the number of neurons and over-producing glial cells. And although glial cells help develop and protect neuron function, too many glia cells disturb connectivity between neurons. They also cause inflammation, which has been noted in the brains of autistic children. In addition, excessive amounts of the acid shorten and damage pathways that neurons use to communicate with the rest of the body. This combination of reduced neurons and damaged pathways hinder the brain’s ability to communicate, resulting in behaviors that are often found in children with autism, including repetitive behavior, mobility issues and inability to interact with others.”

So, the autistic brain develops according to higher levels of propionate and maybe becomes accustomed to it. A state of dysfunction becomes what feels normal. Propionate causes inflammation and, as Dr. Ede points out, “anything that triggers inflammation/oxidation (i.e. refined carbs) spikes brain glutamate production”. High levels of propionate and glutamate become part of the state of mind the autistic becomes identified with. It all links together. Autistics, along with cravings for foods containing propionate (and glutamate), tend to have larger populations of a particular gut microbe that produces propionate. This might be why antibiotics, in killing off such microbes, can help with autism. But in the case of depression, gut issues are associated instead with the lack of certain microbes that produce butyrate, another important substance that is also found in certain foods (Mireia Valles-Colomer et al., The neuroactive potential of the human gut microbiota in quality of life and depression). Depending on the specific gut dysbiosis, diverse neurocognitive conditions can result. And in affecting the microbiome, changes in autism can be achieved through a ketogenic diet, which reduces the microbiome (similar to an antibiotic) — this presumably takes care of the problematic microbes and readjusts the gut from dysbiosis to a healthier balance. Also, ketosis would reduce the inflammation that is associated with glutamate production.

As with propionate, exorphins injected into rats will likewise elicit autistic-like behaviors. By two different pathways, the body derives exorphins and propionate from the consumption of grains and dairy: the former from the breakdown of proteins, the latter produced by gut bacteria in the breakdown of some grains and refined carbohydrates (on top of the propionate used as a food additive, which is added to many other foods as well; and, at least in rodents, artificial sweeteners also increase propionate levels). This is part of the explanation for why many autistics have responded well to low-carb ketosis, specifically paleo diets that restrict both wheat and dairy. But ketones themselves play a role: they use the same transporters as propionate and so block its buildup in cells. And, of course, ketones offer a different energy source for cells as a replacement for glucose, which alters how cells function, specifically neurocognitive functioning and its attendant psychological effects.

There are some other factors to consider as well. With agriculture came a diet high in starchy carbohydrates and sugar. This inevitably leads to increased metabolic syndrome, including diabetes. And diabetes in pregnant women is associated with autism and attention deficit disorder in children. “Maternal diabetes, if not well treated, which means hyperglycemia in utero, that increases uterine inflammation, oxidative stress and hypoxia and may alter gene expression,” explained Anny H. Xiang. “This can disrupt fetal brain development, increasing the risk for neural behavior disorders, such as autism” (Maternal HbA1c influences autism risk in offspring). The increase of diabetes itself, not merely increased diagnosis, could explain the greater prevalence of autism over time. Grain surpluses only became available in the 1800s, around the time when refined flour and sugar began to become common. It wasn’t until the following century that carbohydrates finally overtook animal foods as the mainstay of the diet, specifically in terms of what is most regularly eaten throughout the day in both meals and snacks — a constant influx of glucose into the system.

A further contributing factor in modern agriculture is that of pesticides, also associated with autism. Consider DDE, a breakdown product of DDT, which was banned decades ago but apparently still lingers in the environment. “The odds of autism among children were increased, by 32 percent, in mothers whose DDE levels were high (high was, comparatively, 75th percentile or greater),” one study found (Aditi Vyas & Richa Kalra, Long lingering pesticides may increase risk for autism: Study). “Researchers also found,” the article reports, “that the odds of having children on the autism spectrum who also had an intellectual disability were increased more than two-fold when the mother’s DDE levels were high.” A different study showed a broader effect in terms of 11 pesticides still in use:

“They found a 10 percent or more increase in rates of autism spectrum disorder, or ASD, in children whose mothers lived during pregnancy within about a mile and a quarter of a highly sprayed area. The rates varied depending on the specific pesticide sprayed, and glyphosate was associated with a 16 percent increase. Rates of autism spectrum disorders combined with intellectual disability increased by even more, about 30 percent. Exposure after birth, in the first year of life, showed the most dramatic impact, with rates of ASD with intellectual disability increasing by 50 percent on average for children who lived within the mile-and-a-quarter range. Those who lived near glyphosate spraying showed the most increased risk, at 60 percent” (Nicole Ferox, It’s Personal: Pesticide Exposures Come at a Cost).

So far, my focus has been on what we ingest or are otherwise exposed to because of agriculture and the food system, in general and more specifically in industrialized society with its refined, processed, and adulterated foods, largely from plants. But the other side of the picture is what we are lacking, what we are deficient in. An agricultural diet hasn’t only increased certain foods and substances but simultaneously decreased others. What promoted optimal health throughout human evolution has, in many cases, been displaced or blocked. Agriculture is highly destructive and has depleted the nutrient level of the soil (see Carnivore Is Vegan) and, along with this, even animal foods raised within the agricultural system are similarly depleted of nutrients as compared to animal foods from pasture or free-range sources. For example, fat-soluble vitamins (true vitamin A as retinol, vitamin D3, vitamin K2 not to be confused with K1, and the vitamin E complex) are not found in plant foods, and they occur in far lower concentrations in foods from factory-farmed animals or from animals grazing on poor agricultural soil, soil further degraded by erosion and desertification.

One of the biggest changes with agriculture was the decrease of fatty animal foods that were nutrient-dense and nutrient-bioavailable. It is in the fat that the fat-soluble vitamins are found, and the fat-soluble vitamins relate to almost everything else, such as minerals like calcium and magnesium that are also found in animal foods (Calcium: Nutrient Combination and Ratios); the relationship of seafood with sodium, magnesium, and potassium is central (On Salt: Sodium, Trace Minerals, and Electrolytes). These animal foods used to hold the prized position in the human diet, and in the earlier hominid diet as well, as part of our evolutionary inheritance from millions of years of adaptation to a world where fatty animals once were abundant (J. Tyler Faith, John Rowan & Andrew Du, Early hominins evolved within non-analog ecosystems). That was definitely true in the Paleolithic before the megafauna die-off, but even to this day hunter-gatherers, when they have access to traditional territory and prey, will seek out the fattest animals available, entirely ignoring lean animals, because rabbit sickness is worse than hunger (humans can always fast for many days or weeks, if necessary).

It wasn’t only fat-soluble vitamins that were lost, though. Humans traditionally ate nose-to-tail, and this brought with it a plethora of nutrients, even some thought of as being sourced only from plant foods (in its raw or lightly cooked form, meat has more than enough vitamin C for a low-carb diet; whereas a high-carb diet, since glucose competes with vitamin C, requires higher intake of this nutrient; see Sailors’ Rations, a High-Carb Diet; also consider that prebiotics can be found in animal foods as well, and animal-based prebiotics likely feed a very different kind of microbiome that could shift so much else in the body, such as neurotransmitter production: “I found this list of prebiotic foods that were non-carbohydrate that included cellulose, cartilage, collagen, fructooligosaccharides, glucosamine, rabbit bone, hair, skin, glucose. There’s a bunch of things that are all — there’s also casein. But these tend to be some of the foods that actually have some of the highest prebiotic content,” from Vanessa Spina as quoted in Fiber or Not: Short-Chain Fatty Acids and the Microbiome). Let me briefly mention fat-soluble vitamins again in making a point about other animal-based nutrients. Fat-soluble vitamins, similar to ketosis and autophagy, have a profound effect on human biological functioning, including that of the mind (see the work of Weston A. Price as discussed in Health From Generation To Generation; also see the work of those described in Physical Health, Mental Health). In many ways, they are closer to hormones than mere nutrients, as they orchestrate entire systems in the body and how other nutrients get used, particularly seen with vitamin K2, which Weston A. Price discovered and called “Activator X” (only found in animal and fermented foods, not in whole or industrially-processed plant foods). I bring this up because some other animal-based nutrients play a similarly important role. Consider glycine, the main amino acid in collagen.
It is available in connective tissues and can be obtained through soups and broths made from bones, skin, ligaments, cartilage, and tendons. Glycine is right up there with the fat-soluble vitamins in being central to numerous systems, processes, and organs.

As I’ve already discussed glutamate at great length, let me further that discussion by pointing out a key link. “Glycine is found in the spinal cord and brainstem where it acts as an inhibitory neurotransmitter via its own system of receptors,” writes Afifah Hamilton. “Glycine receptors are ubiquitous throughout the nervous system and play important roles during brain development. [Ito, 2016] Glycine also interacts with the glutaminergic neurotransmission system via NMDA receptors, where both glycine and glutamate are required, again, chiefly exerting inhibitory effects” (10 Reasons To Supplement With Glycine). Hamilton elucidates the dozens of roles played by this master nutrient and the diverse conditions that follow from its deprivation or insufficiency — it’s implicated in obsessive-compulsive disorder, schizophrenia, and alcohol use disorder, along with much else such as metabolic syndrome. But it is glycine’s essential partnership with glutamate that really stands out for this discussion. “Glutathione is synthesised,” Hamilton further explains, “from the amino acids glutamate, cysteine, and glycine, but studies have shown that the rate of synthesis is primarily determined by levels of glycine in the tissue. If there is insufficient glycine available the glutathione precursor molecules are excreted in the urine. Vegetarians excrete 80% more of these precursors than their omnivore counterparts indicating a more limited ability to complete the synthesis process.” Did you catch what she is saying there? Autistics already have too much glutamate and, if they are deficient in glycine, they won’t be able to convert that glutamate into the important antioxidant glutathione. When the body is overwhelmed with unused glutamate, it does what it can to eliminate it, but when constantly flooded with high-glutamate intake it can’t keep up.

The whole mess of the agricultural diet, specifically in its modern industrialized form, has been a constant onslaught taxing our bodies and minds. And the consequences are worsening with each generation. What stands out to me about autism, in particular, is how isolating it is. The repetitive behavior and focus on objects resonates with extreme addiction. As with other conditions influenced by diet (schizophrenia, ADHD, etc.), both autism and addiction block normal human relating in creating an obsessive mindset that, in the most extreme forms, blocks out all else. I wonder if all of us moderns are simply expressing milder varieties of this biological and neurological phenomenon. And this might be the underpinning of our hyper-individualistic society, with the earliest precursors showing up in the Axial Age following what Julian Jaynes hypothesized as the breakdown of the much more other-oriented bicameral mind. What if our egoic consciousness with its rigid psychological boundaries is the result of our food system, as part of the civilizational project of mass agriculture?

* * *

Mongolian Diet and Fasting:

For anyone who is curious to learn more, the original point of interest for me was a quote by Jack Weatherford in his book Genghis Khan and the Making of the Modern World: “The Chinese noted with surprise and disgust the ability of the Mongol warriors to survive on little food and water for long periods; according to one, the entire army could camp without a single puff of smoke since they needed no fires to cook. Compared to the Jurched soldiers, the Mongols were much healthier and stronger. The Mongols consumed a steady diet of meat, milk, yogurt, and other dairy products, and they fought men who lived on gruel made from various grains. The grain diet of the peasant warriors stunted their bones, rotted their teeth, and left them weak and prone to disease. In contrast, the poorest Mongol soldier ate mostly protein, thereby giving him strong teeth and bones. Unlike the Jurched soldiers, who were dependent on a heavy carbohydrate diet, the Mongols could more easily go a day or two without food.” By the way, that biography was written by an anthropologist who lived among and studied the Mongols for years. It is about the historical Mongols, but filtered through the direct experience of still-existing Mongol people who have maintained a traditional diet and lifestyle longer than most other populations. Their diet was ketogenic not only because it was low-carb but also because it involved fasting.

In Mongolia, the Tangut Country, and the Solitudes of Northern Tibet, Volume 1 (1876), Nikolaĭ Mikhaĭlovich Przhevalʹskiĭ writes in the second note on p. 65, under the section Calendar and Year-Cycle: “On the New Year’s Day, or White Feast of the Mongols, see ‘Marco Polo’, 2nd ed. i. p. 376-378, and ii. p. 543. The monthly festival days, properly for the Lamas days of fasting and worship, seem to differ locally. See note in same work, i. p. 224, and on the Year-cycle, i. p. 435.” This is alluded to in another text, which describes how such things as fasting were the norm of that time: “It is well known that both medieval European and traditional Mongolian cultures emphasized the importance of eating and drinking. In premodern societies these activities played a much more significant role in social intercourse as well as in religious rituals (e.g., in sacrificing and fasting) than nowadays” (Antti Ruotsala, Europeans and Mongols in the middle of the thirteenth century, 2001). A science journalist trained in biology, Dyna Rochmyaningsih, also mentions this: “As a spiritual practice, fasting has been employed by many religious groups since ancient times. Historically, ancient Egyptians, Greeks, Babylonians, and Mongolians believed that fasting was a healthy ritual that could detoxify the body and purify the mind” (Fasting and the Human Mind).

Mongol shamans and priests fasted, no different than in so many other religions, but so did other Mongols — more from Przhevalʹskiĭ’s 1876 account, showing the standard feast-and-fast cycle of many traditional ketogenic diets: “The gluttony of this people exceeds all description. A Mongol will eat more than ten pounds of meat at one sitting, but some have been known to devour an average-sized sheep in twenty-four hours! On a journey, when provisions are economized, a leg of mutton is the ordinary daily ration for one man, and although he can live for days without food, yet, when once he gets it, he will eat enough for seven” (see more quoted material in Diet of Mongolia). Fasting was also noted of earlier Mongols, such as Genghis Khan: “In the spring of 1211, Jenghis Khan summoned his fighting forces […] For three days he fasted, neither eating nor drinking, but holding converse with the gods. On the fourth day the Khakan emerged from his tent and announced to the exultant multitude that Heaven had bestowed on him the boon of victory” (Michael Prawdin, The Mongol Empire, 1967). Even before he became Khan, this was his practice, as was common among the Mongols, such that it became a communal ritual for the warriors:

“When he was still known as Temujin, without tribe and seeking to retake his kidnapped wife, Genghis Khan went to Burkhan Khaldun to pray. He stripped off his weapons, belt, and hat – the symbols of a man’s power and stature – and bowed to the sun, sky, and mountain, first offering thanks for their constancy and for the people and circumstances that sustained his life. Then, he prayed and fasted, contemplating his situation and formulating a strategy. It was only after days in prayer that he descended from the mountain with a clear purpose and plan that would result in his first victory in battle. When he was elected Khan of Khans, he again retreated into the mountains to seek blessing and guidance. Before every campaign against neighboring tribes and kingdoms, he would spend days in Burhkhan Khandun, fasting and praying. By then, the people of his tribe had joined in on his ritual at the foot of the mountain, waiting his return” (Dr. Hyun Jin Preston Moon, Genghis Khan and His Personal Standard of Leadership).

As an interesting side note, the Mongol population has been studied to some extent in one area of relevance. In Down’s Anomaly (1976), Smith et al. write that, “The initial decrease in the fasting blood sugar was greater than that usually considered normal and the return to fasting blood sugar level was slow. The results suggested increased sensitivity to insulin. Benda reported the initial drop in fasting blood sugar to be normal but the absolute blood sugar level after 2 hours was lower for mongols than for controls.” That is probably the result of a traditional low-carb diet that had been maintained continuously since before recorded history. For some further context, I noticed some discussion of the Mongolian keto diet (Reddit, r/keto, TIL that Ghenghis Khan and his Mongol Army ate a mostly keto based diet, consisting of lots of milk and cheese. The Mongols were specially adapted genetically to digest the lactase in milk and this made them easier to feed.) that was inspired by the scientific documentary “The Evolution of Us” (presently available on Netflix and elsewhere).

* * *

3/30/19 – An additional comment: I briefly mentioned sugar, that it causes a serotonin high and activates the hedonic pathway. I also noted that it was late in civilization when sources of sugar were cultivated and, I could add, even later when sugar became cheap enough to be common. Even into the 1800s, sugar was minimal and still often considered more as medicine than food.

To extend this thought, it isn’t only sugar in general but specific forms of it. Fructose, in particular, has become widespread because the United States government subsidizes corn agriculture, which has created a greater corn yield than humans can directly consume. So, what doesn’t get fed to animals or turned into ethanol is mostly made into high-fructose corn syrup and then added to almost every processed food and beverage imaginable.

Fructose is not like other sugars. This was important for early hominid survival and so shaped human evolution. It might have played a role in fasting and feasting. In 100 Million Years of Food, Stephen Le writes that, “Many hypotheses regarding the function of uric acid have been proposed. One suggestion is that uric acid helped our primate ancestors store fat, particularly after eating fruit. It’s true that consumption of fructose induces production of uric acid, and uric acid accentuates the fat-accumulating effects of fructose. Our ancestors, when they stumbled on fruiting trees, could gorge until their fat stores were pleasantly plump and then survive for a few weeks until the next bounty of fruit was available” (p. 42).

That makes sense to me, but he goes on to argue against this possible explanation. “The problem with this theory is that it does not explain why only primates have this peculiar trait of triggering fat storage via uric acid. After all, bears, squirrels, and other mammals store fat without using uric acid as a trigger.” This is where Le’s knowledge is lacking, for he never discusses ketosis, which has been centrally important for humans in a way it isn’t for other animals. If uric acid increases fat production, that would be helpful for fattening up before the next starvation period, when the body returned to ketosis. So, it would be a regular switching back and forth between the formation of uric acid that stores fat and the formation of ketones that burns fat.

That is fine and dandy under natural conditions. Excess fructose, however, is a whole other matter. It has been strongly associated with metabolic syndrome. One pathway of causation is the increased production of uric acid. This can lead to gout but to other things as well. It’s a mixed bag. “While it’s true that higher levels of uric acid have been found to protect against brain damage from Alzheimer’s, Parkinson’s, and multiple sclerosis, high uric acid unfortunately increases the risk of brain stroke and poor brain function” (p. 43).

The potential side effects of uric acid overdose are related to other problems I’ve discussed in relation to the agricultural mind. “A recent study also observed that high uric acid levels are associated with greater excitement-seeking and impulsivity, which the researchers noted may be linked to attention deficit hyperactivity disorder (ADHD)” (p. 43). The problems of sugar go far beyond mere physical disease. It’s one more factor in the drastic transformation of the human mind.

* * *

4/2/19 – More info: There are certain animal fats, the omega-3 fatty acids EPA and DHA, that are essential to human health. These were abundant in the hunter-gatherer diet. But over the history of agriculture, they have become less common.

This decline is associated with psychiatric disorders and general neurocognitive problems, including those already mentioned above in the post. Agriculture and industrialization have replaced these healthy lipids with industrially-processed seed oils that are high in linoleic acid (LA), an omega-6 fatty acid. LA interferes with the body’s use of omega-3 fatty acids. Worse still, these seed oils appear to be mutagenic, a possible causal factor behind conditions like autism (Dr. Catherine Shanahan On Dietary Epigenetics and Mutations).

The loss of healthy animal fats in the diet might be directly related to numerous conditions. “Children who lack DHA are more likely to have increased rates of neurological disorders, in particular attention deficit hyperactivity disorder (ADHD), and autism” (Maria Cross, Why babies need animal fat).

“Biggest dietary change in the last 60 years has been avoidance of animal fat. Coincides with a huge uptick in autism incidence. The human brain is 60 percent fat by weight. Much more investigation needed on correspondence between autism and prenatal/child ingestion of dietary fat.”
~ Brad Lemley

The Brain Needs Animal Fat
by Georgia Ede

Maternal Dietary Fat Intake in Association With Autism Spectrum Disorders
by Kristen Lyall et al

“Maternal intake of fish, a key source of fatty acids, has been investigated in association with child neurodevelopmental outcomes in several studies. […]

“Though speculative at this time, the inverse association seen for those in the highest quartiles of intake of ω-6 fatty acids could be due to biological effects of these fatty acids on brain development. PUFAs have been shown to be important in retinal and brain development in utero (37) and to play roles in signal transduction and gene expression and as components of cell membranes (38, 39). Maternal stores of fatty acids in adipose tissue are utilized by the fetus toward the end of pregnancy and are necessary for the first 2 months of life in a crucial period of development (37). The complex effects of fatty acids on inflammatory markers and immune responses could also mediate an association between PUFA and ASD. Activation of the maternal immune system and maternal immune aberrations have been previously associated with autism (5, 40, 41), and findings suggest that increased interleukin-6 could influence fetal brain development and increase risk of autism and other neuropsychiatric conditions (42–44). Although results for effects of ω-6 intake on interleukin-6 levels are inconsistent (45, 46), maternal immune factors potentially could be affected by PUFA intake (47). […]

“Our results provide preliminary evidence that increased maternal intake of ω-6 fatty acids could reduce risk of offspring ASD and that very low intakes of ω-3 fatty acids and linoleic acid could increase risk.”

* * *

6/13/19 – About the bicameral mind, I saw some other evidence for it in relationship to fasting. The passage quoted below describes ancient humans experiencing spirits after ten days of fasting. One thing is certain: a person can be fully in ketosis within three days. This would be true even if it wasn’t total fasting, as the caloric restriction would achieve the same end.

The author, Michael Carr, doesn’t think fasting was the cause of the spirit visions, but he doesn’t explain the reason(s) for his doubt. There is a long history of fasting being used to achieve this intended outcome. If fasting were ineffective for this purpose, why has nearly every known traditional society used such methods for millennia? These people knew what they were doing.

By the way, imbibing alcohol after the fast would really knock someone into an altered state. The body becomes even more sensitive to alcohol when in a ketogenic state during fasting. Combine this altered state with ritual, setting, cultural expectation, and archaic authorization, and I don’t have any doubt that spirit visions could easily be induced.

Reflections on the Dawn of Consciousness
ed. by Marcel Kuijsten
Kindle Location 5699-5718

Chapter 13
The Shi ‘Corpse/ Personator’ Ceremony in Early China
by Michael Carr

“”Ritual Fasts and Spirit Visions in the Liji” 37 examined how the “Record of Rites” describes zhai 齋 ‘ritual fasting’ that supposedly resulted in seeing and hearing the dead. This text describes preparations for an ancestral sacrifice that included divination for a suitable day, ablution, contemplation, and a fasting ritual with seven days of sanzhai 散 齋 ‘relaxed fasting; vegetarian diet; abstinence (esp. from sex, meat, or wine)’ followed by three days of zhizhai 致 齋 ‘strict fasting; diet of grains (esp. gruel) and water’.

“Devoted fasting is inside; relaxed fasting is outside. During fast-days, one thinks about their [the ancestor’s] lifestyle, their jokes, their aspirations, their pleasures, and their affections. [After] fasting three days, then one sees those [spirits] for whom one fasted. On the day of the sacrifice, when one enters the temple, apparently one must see them at the spirit-tablet. When one returns to go out the door [after making sacrifices], solemnly one must hear sounds of their appearance. When one goes out the door and listens, emotionally one must hear sounds of their sighing breath. 38

“This context unequivocally uses biyou 必 有 ‘must be/ have; necessarily/ certainly have’ to describe events within the ancestral temple; the faster 必 有 見 “must have sight of, must see” and 必 有 聞 “must have hearing of, must hear” the deceased parent. Did 10 days of ritual fasting and mournful meditation necessarily cause visions or hallucinations? Perhaps the explanation is extreme or total fasting, except that several Liji passages specifically warn against any excessive fasts that could harm the faster’s health or sense perceptions. 39 Perhaps the explanation is inebriation from drinking sacrificial jiu 酒 ‘(millet) wine; alcohol’ after a 10-day fast. Based on measurements of bronze vessels and another Liji passage describing a shi personator drinking nine cups of wine, 40 York University professor of religious studies Jordan Paper calculates an alcohol equivalence of “between 5 and 8 bar shots of eighty-proof liquor.” 41 On the other hand, perhaps the best explanation is the bicameral hypothesis, which provides a far wider-reaching rationale for Chinese ritual hallucinations and personation of the dead.”

* * *

7/16/19 – One common explanation for autism is the extreme male brain theory. A recent study may have come up with supporting evidence (Christian Jarrett, Autistic boys and girls found to have “hypermasculinised” faces – supporting the Extreme Male Brain theory). Autistics, including females, tend to have hypermasculinised faces. This might be caused by greater exposure to testosterone in the womb.

This immediately made me wonder how diet relates. Changes in diet alter hormonal functioning. Endocrinology, the study of hormones, has been a major part of the diet debate going back to European researchers earlier last century (as discussed by Gary Taubes). Diet affects hormones, and hormones in turn affect diet. But I had something more specific in mind.

What about propionate and glutamate? What might their relationship be to testosterone? In a brief search, I couldn’t find anything about propionate. But I did find some studies related to glutamate. There is an impact on the endocrine system, although these studies weren’t looking at the results in terms of autism specifically or neurocognitive development in general. It points to some possibilities, though.

One could extrapolate from one of these studies that increased glutamate in the pregnant mother’s diet could alter what testosterone does to the developing fetus, in that testosterone increases the toxicity of glutamate, which might not be a problem under normal conditions of lower glutamate levels. This would be further exacerbated during breastfeeding and later on, when the child began eating the same glutamate-rich diet as the mother.

Testosterone increases neurotoxicity of glutamate in vitro and ischemia-reperfusion injury in an animal model
by Shao-Hua Yang et al

Effect of Monosodium Glutamate on Some Endocrine Functions
by Yonetani Shinobu and Matsuzawa Yoshimasa

Gender and Personality on the Autism Spectrum

There is ongoing debate about autism, such as how it is defined and what causes it, which in turn leads to how it is and should be diagnosed. Some have speculated that autism in girls and women is underdiagnosed:

However, it’s unclear whether this gender bias is the result of genetics or reflects differences in diagnosis or the way females manifest symptoms of the disorder. Girls with autism tend to actively compensate for their symptoms in ways that boys don’t, which may account for the discrepancy, says Skuse.

As a result, the females enrolled in studies may tend to be severely affected and carry multiple mutations. “There is some suggestion that higher-functioning females are out there in the general population, but they’re not being referred,” he says.

One could argue that the bias is most likely not just in diagnosis, for there would be a directly related bias in the research itself. After all, it is diagnosis that determines the subjects in autism studies. So, if diagnosis is biased, there is no reason to assume that the subjects are representative of the full autistic population. Biased input would inevitably lead to biased results and hence biased conclusions. Basically, these studies at present might not be able to tell us anything about possible gender differences.

A reason given for the alleged failure to detect female autism is that “it may be because girls are better at masking the symptoms – better at copying social norms while not necessarily understanding them.” That might be true of many boys and men as well.

I have some Asperger’s-like traits, although I’ve never been diagnosed. Maybe it’s because I learned to fit in. I was socially clueless when younger and social situations stress me out, a set of factors exacerbated by my inner-focused nature. I don’t connect easily with others. But you wouldn’t notice that from casually interacting with me. I know how to pretend to be normal. It’s maybe why therapy has never worked for me, as I’ve developed a habit of effectively hiding my problems. It’s a survival mechanism that I learned young.

What occurs to me is that I’m a Jungian Feeling type. Myers-Briggs testing has found that most Feeling types are female, although about 30% are male. The same pattern in the opposite direction is seen with Thinking types. There is a general pattern that follows along gender lines. Still, that approximate third of the population is a significant number. That might mean that a third of male autistics don’t fit the male pattern, while perhaps a third of female autistics do.

So the seeming gender difference found in autism could be more about personality differences. And those personality differences may or may not be genetic in nature. Much of this could instead be culturally learned behavior. It wouldn’t only be cultural bias in the diagnosis of autism; if that is so, there would also be cultural bias in how autism is expressed. In that case, the question is what the relationship might be between culture, personality, gender, and neurocognitive development. There are obviously many complex factors involved, such as the fact that a significant number of people don’t fall into simple gender categories: “It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA.” Since gender isn’t binary, the expressions of autism presumably also wouldn’t be binary.

It would be easy to test my speculation if formulated as a hypothesis. My prediction would be that Thinking type females would be more likely to be diagnosed as autistic. And the opposite prediction would be that Feeling type males would be less likely. That is simply to say that autism would express differently depending on personality traits/functions. Similar research could be done with the FFM/Big Five, and maybe such research already has been done. A related issue that would need to be disentangled is whether autism is more common among certain personalities or simply more diagnosed among certain personalities, an issue that could be studied either in relation to or separate from gender.
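To give a sense of what such a test would look like in practice, here is a minimal sketch of a chi-square test of independence between personality type and diagnosis. Every number in it is invented purely for illustration; no real data of this kind is cited anywhere in this post.

```python
# Hypothetical sketch: a 2x2 chi-square test of independence between
# personality type (Thinking vs. Feeling) and autism diagnosis.
# All counts are made up for illustration -- they are NOT real data.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 contingency table:
                 diagnosed   not diagnosed
    Thinking         a             b
    Feeling          c             d
    Uses the standard shortcut formula for 2x2 tables.
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented counts for, say, 400 women screened:
#   Thinking types: 18 diagnosed, 102 not
#   Feeling types:  12 diagnosed, 268 not
stat = chi_square_2x2(18, 102, 12, 268)
print(round(stat, 2))  # compare against the 3.84 cutoff (p = .05, df = 1)
```

A statistic above the cutoff would suggest diagnosis rates differ by type; it still couldn’t distinguish whether autism is actually more common among certain personalities or merely more often diagnosed among them, which is the disentangling problem noted above.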

All of this is particularly complicated for certain Myers-Briggs types. My specific type is INFP. This type is one of the most talented types when it comes to masking behavior, “known as being inscrutable.” As Carl Jung described dominant introverted feeling (what Myers-Briggs divides into two types: INFP and ISFP):

They are mostly silent, inaccessible, hard to understand; often they hide behind a childish or banal mask, and their temperament is inclined to melancholy…Their outward demeanor is harmonious, inconspicuous…with no desire to affect others, to impress, influence or change them in any way. If this is more pronounced, it arouses suspicion of indifference and coldness…Although there is a constant readiness for peaceful and harmonious co-existence, strangers are shown no touch of amiability, no gleam of responsive warmth…It might seem on a superficial view that they have no feelings at all.
(Psych. Types, Para. 640-641)

An INFP aspie would make for a rather confusing specimen. It is the dominant introverted feeling that is so hard to discern. And this introverted feeling is hidden behind the chameleon-like and outward-facing extraverted intuition, which occupies the position called the auxiliary function. Extraverted intuition is the ultimate mask to hide behind, as it is highly fluid and adaptable. And as the auxiliary function, extraverted intuition plays the role of mediation with and defense against the outside world.

Maybe a significant number of autistics have hidden introverted feeling. This would fit the autistic pattern of feeling strongly in response to others (high functioning affective empathy) while not easily connecting to others (low functioning cognitive empathy). By its nature, there is no straightforward way for introverted feeling to be expressed in social behavior. Yet an INFP can be talented at learning normal social behavior, as extraverted intuition helps them to be mimics. Or failing that, they could stonewall anyone trying to figure them out. Usually being conflict avoidant, most dominant introverted feeling types will go along to get along, as long as no one treads on their core sense of value.

Here is a more general point:

I think it’s a bit silly to make a distinction between “male” and “female” interests in the first place and realize that it can also be healthy for women to take interest in more traditionally “male” subjects such as science and technology and that doesn’t always mean that they have a disorder. In making a diagnosis they should always be aware of the underlying pattern rather than the actual interest and keep in mind that interests may differ for each individual, so (e.g.) whether a female is obsessively talking about computers or fashion should not matter, because the pattern is the same. Indeed, it probably is more obvious in the first case, especially when society is more geared toward male/female stereotyping [so “masculine” interests for women stand out]. And besides, narrow interests is but 1 clue, it doesn’t count for every individual with an ASD; they may have a range of interests, just as typical people do.

Also, as some typologists argue, the US has been a society dominated by ESTJ types that is becoming dominated by ENTJ types (John Giannini, Compass of the Soul). The commonality is E_TJ, that is to say dominant extraverted thinking. This typological bias is how American culture defines and enforces the social norms of the male gender. Unsurprisingly, that would also be how autism gets diagnosed: according to extraversion and thinking.

On the other hand, autism that was introverted and/or feeling would express in entirely different ways. In particular, dominant introverted feeling would express as strong affective empathy, rather than the (stereotypically) masculine tendency toward emotional detachment. Also, introversion taken on its own, whether in relation to feeling or thinking, would be more internalized and hence less observable — meaning obsessions that would be unlikely to be seen in outward behavior: more subtle and nuanced, or else more hidden and masked.

This personality perspective might be immensely more helpful than using a gender lens alone. It’s also a more psychologically complex frame of interpretation, appealing to my personality predilections. Considering that autism and Asperger’s were originally observed and defined by men, one might wonder what kind of personalities those men had. Their personalities might have determined which personalities they were drawn to studying and hence drawn to using as the standard for their early models of the autism spectrum.

Aspergers and Chunking

I was reading Ungifted: Intelligence Redefined by Scott Barry Kaufman. I came across a section about Aspergers. The more I’ve read about it over the years, the more I suspect that I have some form of it.*

One theory of autism is that it involves a strong focus on details, which can lead to not seeing the forest for the trees; if high-functioning enough, though, a person can compensate for this. The Aspie takes in so many details that this can lead to distraction and cognitive overload. There are two primary ways of dealing with this. First, Aspies might limit their interactions and narrow their focus to create a more manageable space in which to think and to feel more comfortable. Second, Aspies often learn to chunk information.

The second method is what I learned as a child when I was living in Deerfield, Illinois (a wealthy Jewish suburb of Chicago; more on this below). I was having trouble with reading and I stuttered. I had a hard time saying what a word was or even recalling the names of my friends. But I could describe what I meant when I wasn’t stuttering, and the only reason I was stuttering was that I couldn’t recall the word.

I went to speech therapy, but even the therapist wasn’t sure of my precise problem. This therapist and my mom, who also was a speech therapist, went to a talk given by Diane J. German of Northwestern University, who maybe was working on her PhD dissertation at the time (my mom thinks this was in 1982, since I was diagnosed in first grade when I was 6 years old). She is now a professor emeritus at National Louis University. At the time, German was working on a new test for word recall issues. Here is an article about her work:

“The look on these children’s faces captures the problem in the most compelling way,” says Diane German, the principal researcher, who specializes in disorders of word-finding and a special education professor at National-Louis University in Chicago, Illinois. “They really struggle when they have to read a simple word like ‘nest’ out loud. Some grimace, others look stuck. Some just blurt out an answer that’s almost always wrong. Yet when asked to point to the same word on a page, they almost always get it right. Clearly they’ve got a problem and need help, but it’s not that they lack reading skills.”

One child in the study, previously diagnosed with these “word-finding” difficulties, couldn’t say “cocoon” as he tried to read a story aloud. When he got to the word, he stumbled and added, “You know, it is that brown thing hanging in the tree.”

“Clearly, this child had managed to ‘read’ the word to himself and comprehend it, or he could never have come up with that kind of description,” explains psychologist Rochelle Newman, co-author of the study and a University of Maryland professor of hearing and speech sciences. “He just couldn’t retrieve the sound pattern of the word.”

(Another piece by her: “Ask Yourself, Are You Doing Enough for Your Learners with Word Finding Difficulties?”)

They immediately recognized that German was talking about my issues. German was looking to do a study. My mom did some of the testing for German’s study, and my mom recalls German coming to our house and testing me herself. That is how I became one of the kids used as a subject in her study. And that was the beginning of how I, unlike so many other kids, escaped the trap of sub-par remedial education and a life of low expectations.

My mom and the therapist learned about this new field of word recall issues. Before that time, no one was discussing any of this, and speech therapists weren’t being taught about it. It was serendipity that I was beginning school at that time and near where this new field was being developed. With this new knowledge, my mom worked with my therapist to help me with word recall (along with a learning disability therapist, Diane Redfield, who taught me to read).

One of the things that helped me the most was information chunking. My mom explained that this had to do with more than just grouping similar words. It has to do with looking at words from every angle in order to understand their different aspects. It is a shifting of perspective and a breaking down into component parts. This is what allows word groupings to be useful. Grouping words goes hand in hand with chunking information. The more kinds of groupings and chunkings, the greater the capacity to think and communicate clearly.

I had an example of this just last night. I was thinking of early 20th century anarchists and I was trying to recall one specific person. From the word ‘anarchist’, I thought of women’s clinic. Then from that I connected to the last name Goldman. Once I had the last name, I could recall the first name and so had the full name: Emma Goldman. I couldn’t just pull the name out by itself. I had to go through a process to get to it.
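The recall chain just described (anarchist → women’s clinic → Goldman → Emma Goldman) can be pictured as following links in an association graph rather than doing a direct lookup. Here is a toy sketch of that idea; the graph and its links are invented for illustration only, and nothing in it is meant as a model of actual memory.

```python
# A toy model of associative recall: instead of retrieving a name
# directly, follow chains of associated cues until the target appears.
# The association graph below is invented purely for illustration.
from collections import deque

associations = {
    "anarchist": ["women's clinic"],
    "women's clinic": ["Goldman"],
    "Goldman": ["Emma Goldman"],
}

def recall(start, target):
    """Breadth-first walk through the associations, returning the
    chain of cues leading from the starting concept to the target,
    or None if no chain exists."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in associations.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(recall("anarchist", "Emma Goldman"))
# ['anarchist', "women's clinic", 'Goldman', 'Emma Goldman']
```

The point of the sketch is simply that the answer is reached as a path, not a single step, which matches the feeling of having to “go through a process to get to it.”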

That isn’t my only method. I also use something similar to chunking that is more on a feeling level. I get an overall sense of something, a person or an idea or whatever. Once I have that sense, I just have to switch into the right state of mind and slowly feel into it. Anything I’m familiar with has a feeling-sense associated with it. This form of recall isn’t always efficient, but it works when I can’t use a direct chain of connections. This feeling-sense is very useful in general, though, for it allows me to chunk info in larger ways and helps me in feeling out patterns by sensing resonances.

All of this fits into why I’ve come to suspect I have Aspergers or something very similar. The one thing that demonstrated I wasn’t low IQ as a child was my ability to see patterns. This is also a talent of many Aspies. It is because Aspies see things in chunks of details that they are able to more flexibly scan for patterns. It is precisely where various chunks crossover that a whole begins to form, but this is building from the bottom up.

I do this in my thinking and writing. When taken to its extreme, I call the results thought-webs. Connections form, connections build upon connections, and then a sense of meaning emerges from that. It is an organic process of synthesizing rather than analyzing, although analyzing may follow as a secondary process. It is looking to the data to speak for itself, finding the harmony between the seemingly disparate.

It has its strengths and weaknesses. Its greatest strength is for research. My Asperger-like extraverted intuition (MBTI Ne) goes off in a million directions finding all the details until my brain is overloaded. Then begins the filtering and consolidating of it all into a unique synthesis, but that last part can be a doozy. I sometimes never get past the brain overload.

* More recently, I’ve learned of specific language impairment. It can have behavioral symptoms similar to autism, but it’s a different condition and much more common. It’s another possibility in describing my own difficulties, as much of it fits my experience.

As a side note, there is a reason I mentioned above that Deerfield is a wealthy Jewish suburb of Chicago. Here is an interesting detail of Deerfield’s history (from Wikipedia):

“In 1959, when Deerfield officials learned that a developer building a neighborhood of large new homes planned to make houses available to African Americans, they issued a stop-work order. An intense debate began about racial integration, property values, and the good faith of community officials and builders. For a brief time, Deerfield was spotlighted in the national news as “the Little Rock of the North.” Supporters of integration were denounced and ostracized by angry residents. Eventually, the village passed a referendum to build parks on the property, thus putting an end to the housing development. Two model homes already partially completed were sold to village officials. The remaining land lay dormant for years before it was developed into what is now Mitchell Pool and Park and Jaycee Park. At the time, Deerfield’s black population was 12 people out of a total population of 11,786. This episode in Deerfield’s history is described in But Not Next Door by Harry and David Rosen, both residents of Deerfield.
“Since the early 1980s, however, Deerfield has seen a large influx of Jews and, more recently, Asians and Greeks, giving the community a more diverse ethnic makeup.”

I guess what was a wealthy Jewish suburb of Chicago has become a wealthy Jewish, Asian, and Greek suburb of Chicago.

I can tell you one thing for certain. Few poor kids, especially poor minorities, are privileged in the way I was by my early education opportunities. I went to a public school in Deerfield, but that is way different than going to a public school in the inner city of Chicago. If I had been a poor black kid in a poor black neighborhood, I would have been designated low IQ and that would have been the end of it.

How many poor black kids failing in school are as intelligent as I am? The evidence points to the answer being many.

It is one thing to experience something like a learning disability or Aspergers. It is a whole other matter to deal with a learning disability or Aspergers while dealing with poverty and prejudice.

Even ignoring racism, classism by itself is a powerful form of prejudice. My mom was raised working class and she raised us with a working class sensibility. This meant she dressed us working class. My older brother was ridiculed for it in the Deerfield public school. It scarred him for life and contributed to his hatred of school ever after. Part of that had to do with our having previously lived in Bellefontaine, Ohio, a factory town at the edge of Appalachia. Apparently, we had picked up a bit of Appalachian speech, and the rich kids in Deerfield ridiculed Clay for saying ‘zeero’ when he meant ‘zero’.

It was a clear giveaway to our class background. So, even though we were technically upper middle class because my dad was a factory manager, we were new money upper middle class and the other kids knew it. I was, at that time, fortunate enough to have been too young to understand and maybe, because of my Aspergers, too socially oblivious to care.

If such minor forms of prejudice could have such powerful impact on my brother, imagine what more severe (and systemic) forms of prejudice will do to a child. To this day, my brother remains traumatized from his childhood experience of class prejudice and, sadly, has internalized it in ridiculing his ‘white trash’ neighbors in the small working class town he now lives in. Racism and classism, they are shitty mentalities that cause much damage, but unless you’ve been on the receiving end of prejudice it is hard to understand and appreciate.

* * *

Below is part of the section from Ungifted where Aspergers is discussed.

pp. 223-226:

An alternative perspective, which has gained a lot of research support in recent years, is that autism is merely a different way of processing incoming information. 23 Individuals with ASD have a greater attention to detail and tend to adopt a bottom-up strategy— they first perceive the parts of an object and then build up to the whole. 24 As Uta Frith puts it, people with autism have difficulty “seeing the forest for the trees.” There is neurological evidence that the unique mind of the person with ASD is due in part to an excessive number of short-distance, disorganized local connections in the prefrontal cortex (required for attention to detail) along with a reduced number of long-range or global connections necessary for integrating information from widespread and diverse brain regions. 25 As a result, people with high-functioning autism tend to have difficulty switching attention from the local to the global level. 26

This sometimes plays itself out in social communications. People with ASD focus on details in the environment most people find “irrelevant,” which can lead to some awkward social encounters. When people with ASD are shown photographs with social information (such as friends chatting) or movie clips from soap operas, their attention is focused much less on the people’s faces and eyes than the background scenery, such as light switches. 27 Differences among toddlers in attention to social speech is a robust predictor of ASD, and social attention differences in preschool lead to a deficit in theory of mind. 28 This is important, considering that an early lack of attention to social information can deprive the developing child of the social inputs and learning opportunities they require to develop expertise in social cognition. 29 It’s likely that from multiple unrewarding social interactions during the course of development, people with ASD learn that social interactions are unrewarding, and retreat even further into themselves.

Kate O’Connor and Ian Kirk argue that the atypical social behaviors found in people with ASD are more likely the result of a processing difference than a social deficit, and may represent a strategy to filter out too much sensory information. 30 Indeed, people with ASD often report emotional confusion during social interactions, in which they interpret expressions, gestures, and body language to mean something different from or even the opposite of what the other person intended. 31 Many people with ASD report that the eye region is particularly “confusing” and “frightening.” 32

Indeed, the eye region is very complex, transmitting a lot of information in a brief time span. For one thing, it’s always in motion (blinking, squinting, saccadic movement, and so on). But the eye region also can depict a wide range of emotions in rapid succession. It’s likely that over the course of many overwhelming interactions with people in the context of other sensory information coming in from the environment, people with ASD learn to look less at the eye region of faces. 33 People with ASD do frequently report being distracted by sensory information in the environment, including background noise, fluorescent light, shiny objects, body movement, and smells. 34

[ . . . ]

One robust finding is that people with ASD have enhanced perceptual functioning. 40 People with ASD tend to perform better than people without ASD symptoms on IQ subtests that involve nonverbal fluid reasoning and the segmentation and reconstruction of novel visual designs. 41 Individuals with ASD also perform better than controls on the Embedded Figures Task (EFT), which requires quick detection of a target within a complex pattern. 42 The ASD tendency to see patterns as collections of details instead of as wholes helps people with ASD to segment and chunk visual information, freeing up visual working memory resources and allowing them to handle a higher perceptual load than typical adults. 43