Corporate Control, from the EU to the US

There was a recent incident of the EU putting out corporate propaganda. An EU report directly plagiarized a paper written by Big Ag, assuring the public that glyphosate (Roundup) is a healthy additive to your family’s diet and so there is no need to strictly regulate it.

“The BfR [Germany’s Federal Institute for Risk Assessment commissioned by the EU] had thus copied Monsanto’s explanation of Monsanto’s approach in evaluating the published literature, yet had presented it as the approach of the authority. This is a striking example of deception regarding true authorship.”
(Joseph Mercola, EU Infiltrated by Pesticide Industry Plagiarizes Safety Study)

Don’t worry about it. Monsanto’s products are safe and good. How do we know? Because Monsanto told us so. It’s amazing they get away with this kind of thing. And they do it all the time.

Corporate lobbyists regularly have direct influence over politicians. They even sometimes write the bills that get passed into laws. And that is on top of regulatory capture, revolving doors, legalized bribery, etc. I don’t know why we tolerate this. It’s so often done brazenly, as if they are rubbing our faces in it, daring us to try to stop them, as if to demonstrate to us that we are powerless and so we should just cynically accept our subordinate position.

I’m so often reminded of the actions of the East India Company prior to the American Revolution. They thought they were above all morality and laws, beholden to no one. They began taking on the powers of a government, as they piggybacked on British imperialism. That was the first era when corporatism took hold in the Anglo-American world.

It shouldn’t surprise any of us by now. Think about it.

Western governments, on behalf of corporations, have regularly harmed and killed millions of innocents through trade agreements, sanctions, wars of aggression, coups, training paramilitary groups, etc., in order to ensure corporations’ access to trade routes, natural resources, and cheap labor (e.g., Hillary Clinton as Secretary of State intervened in Haiti to drive down wages so as to maintain cheap labor for US corporations, which is why so many Haitian-Americans voted for Trump and helped him to win Florida). A governing body like the EU putting out corporate propaganda is a small act in the big scheme.

Our governments, especially in the US, don’t represent the citizenry. Generations of attempts at reform from within the system have mostly failed, although with a few successes here and there. The US government is more corporatist now than at any prior point in history. Yet every election cycle candidates in both parties promise all kinds of things. That doesn’t stop the system from continuing on as before in serving big biz, as scientific studies have shown. If more of the same keeps resulting in more of the same, maybe it’s time we did something different.

The majority of the American public has been steadily moving left in their policy positions for decades. At this point, the average American is to the left of both parties on many major issues. When some political, media, or think tank elite speaks of ‘centrism’ and ‘moderation’, ask yourself what is the defining frame? Well, obviously they mean moderating toward the center of power, not moderating toward the center of majority support. The problem is the majority doesn’t know it is a majority because the propaganda campaign has been so highly effective with near total control of the party system and corporate media.

Cracks are beginning to show, though. In the past, the gatekeepers would have so tightly controlled these issues that the American public would rarely have heard about any of it. But the corporate media stranglehold is beginning to loosen. Or maybe some of the ruling elite are finally coming around to the sense of self-preservation that motivated a born plutocrat like Theodore Roosevelt to rein in corporate wealth and power.

* * *

‘Call It the Oppression of the Supermajority’: Americans Eager for Bold Change, So Why Can’t They Get It?
by Jake Johnson, Common Dreams

Most Americans support Medicare for All, higher taxes on the rich, a Green New Deal, and other major items on the progressive agenda—so why has Congress failed to enact them?

The reason, Columbia University Law School professor Tim Wu argued in an op-ed for the New York Times on Tuesday, is that the influence of corporations and the donor class on the American political system has drowned out the policy desires of the public.

“In our era, it is primarily Congress that prevents popular laws from being passed or getting serious consideration. (Holding an occasional hearing does not count as ‘doing something’),” Wu wrote. “Entire categories of public policy options are effectively off-limits because of the combined influence of industry groups and donor interests.”

To bolster his argument, Wu rattled off a number of policies that—despite polling extremely well among large, bipartisan swaths of the American public—have not garnered enough support among lawmakers to pass Congress.

“About 75 percent of Americans favor higher taxes for the ultra-wealthy. The idea of a federal law that would guarantee paid maternity leave attracts 67 percent support,” Wu noted. “Eighty-three percent favor strong net neutrality rules for broadband, and more than 60 percent want stronger privacy laws. Seventy-one percent think we should be able to buy drugs imported from Canada, and 92 percent want Medicare to negotiate for lower drug prices. The list goes on.”

Since the election of President Donald Trump in 2016, Congress has in many cases done the opposite of what most Americans want by slashing taxes on the rich, failing to restore net neutrality rules, and attempting to strip healthcare from millions of Americans.

“The defining political fact of our time is not polarization. It’s the inability of even large bipartisan majorities to get what they want on issues like these,” argued Wu. “Call it the oppression of the supermajority. Ignoring what most of the country wants—as much as demagogy and political divisiveness—is what is making the public so angry.”

Wu’s contention that the “combined influence” of the donor class and big business is significantly responsible for Congress’ refusal to enact popular policies matches the conclusion of a 2014 study by political scientists Martin Gilens and Benjamin Page, who found that in the United States, “the majority does not rule—at least not in the causal sense of actually determining policy outcomes.”

“When a majority of citizens disagrees with economic elites or with organized interests, they generally lose,” Gilens and Page wrote. “Moreover, because of the strong status quo bias built into the U.S. political system, even when fairly large majorities of Americans favor policy change, they generally do not get it.”

Conceptual Spaces

In a Nautilus piece, New Evidence for the Strange Geometry of Thought, Adithya Rajagopalan reports on the fascinating topic of conceptual or cognitive spaces. He begins with the work of the philosopher and cognitive scientist Peter Gärdenfors, who wrote about this in a 2000 book, Conceptual Spaces. Then last year, a Science paper was published by several neuroscientists: Jacob Bellmund, Christian Doeller, and Edvard Moser. It has to do with the brain’s “inner GPS.”

Anyone who has followed my blog for a while should see the interest this has for me. There is Julian Jaynes’ thought on consciousness, of course. And there are all kinds of other thinkers as well. I could throw out Iain McGilchrist and James L. Kugel who, though critical of Jaynes, make similar points about identity and the divided mind.

The work of Gärdenfors and the above neuroscientists helps explain numerous phenomena, specifically the ways splintering and dissociation operate. How a Nazi doctor could torture Jewish children at work and then go home to play with his own children. How the typical person can be pious at church on Sunday and yet act in complete contradiction to this for the rest of the week. How we can know that the world is being destroyed through climate change and still go on about our lives as if everything remains the same. How we can simultaneously know and not know so many things. Et cetera.

It might begin to give us some more details in explaining the differences between the bicameral mind and Jaynesian consciousness, between Ernest Hartmann’s thin and thick boundaries of the mind, and much else. Also, in light of Lynne Kelly’s work on traditional mnemonic systems, we might be in a better position to understand the phenomenal memory feats humans are capable of, why they are so often spatial in organization (e.g., the Songlines of Australian Aborigines), and why they often involve shifts in mental states. It might also clarify how people can temporarily or permanently change personalities and identities, how people can compartmentalize parts of themselves, such as their childhood selves, and maybe help explain why others fail at compartmentalizing.

The potential significance is immense. Our minds are mansions with many rooms. Below is the meat of Rajagopalan’s article.

* * *

“Cognitive spaces are a way of thinking about how our brain might organize our knowledge of the world,” Bellmund said. It’s an approach that concerns not only geographical data, but also relationships between objects and experience. “We were intrigued by evidence from many different groups that suggested that the principles of spatial coding in the hippocampus seem to be relevant beyond the realms of just spatial navigation,” Bellmund said. The hippocampus’ place and grid cells, in other words, map not only physical space but conceptual space. It appears that our representation of objects and concepts is very tightly linked with our representation of space.

Work spanning decades has found that regions in the brain—the hippocampus and entorhinal cortex—act like a GPS. Their cells form a grid-like representation of the brain’s surroundings and keep track of its location on it. Specifically, neurons in the entorhinal cortex activate at evenly distributed locations in space: If you drew lines between each location in the environment where these cells activate, you would end up sketching a triangular grid, or a hexagonal lattice. The activity of these aptly named “grid” cells contains information that another kind of cell uses to locate your body in a particular place. The explanation of how these “place” cells work was stunning enough to award scientists John O’Keefe, May-Britt Moser, and Edvard Moser, the 2014 Nobel Prize in Physiology or Medicine. These cells activate only when you are in one particular location in space, or the grid, represented by your grid cells. Meanwhile, head-direction cells define which direction your head is pointing. Yet other cells indicate when you’re at the border of your environment—a wall or cliff. Rodent models have elucidated the nature of the brain’s spatial grids, but, with functional magnetic resonance imaging, they have also been validated in humans.

Recent fMRI studies show that cognitive spaces reside in the hippocampal network—supporting the idea that these spaces lie at the heart of much subconscious processing. For example, subjects of a 2016 study—headed by neuroscientists at Oxford—were shown a video of a bird’s neck and legs morphing in size. Previously they had learned to associate a particular bird shape with a Christmas symbol, such as Santa or a Gingerbread man. The researchers discovered the subjects made the connections with a “mental picture” that could not be described spatially, on a two-dimensional map. Yet grid-cell responses in the fMRI data resembled what one would see if subjects were imagining themselves walking in a physical environment. This kind of mental processing might also apply to how we think about our family and friends. We might picture them “on the basis of their height, humor, or income, coding them as tall or short, humorous or humorless, or more or less wealthy,” Doeller said. And, depending on whichever of these dimensions matters in the moment, the brain would store one friend mentally closer to, or farther from, another friend.

But the usefulness of a cognitive space isn’t just restricted to already familiar object comparisons. “One of the ways these cognitive spaces can benefit our behavior is when we encounter something we have never seen before,” Bellmund said. “Based on the features of the new object we can position it in our cognitive space. We can then use our old knowledge to infer how to behave in this novel situation.” Representing knowledge in this structured way allows us to make sense of how we should behave in new circumstances.

Data also suggests that this region may represent information with different levels of abstraction. If you imagine moving through the hippocampus, from the top of the head toward the chin, you will find many different groups of place cells that completely map the entire environment but with different degrees of magnification. Put another way, moving through the hippocampus is like zooming in and out on your phone’s map app. The area in space represented by a single place cell gets larger. Such size differences could be the basis for how humans are able to move between lower and higher levels of abstraction—from “dog” to “pet” to “sentient being,” for example. In this cognitive space, more zoomed-out place cells would represent a relatively broad category consisting of many types, while zoomed-in place cells would be more narrow.

Yet the mind is not just capable of conceptual abstraction but also flexibility—it can represent a wide range of concepts. To be able to do this, the regions of the brain involved need to be able to switch between concepts without any informational cross-contamination: It wouldn’t be ideal if our concept for bird, for example, were affected by our concept for car. Rodent studies have shown that when animals move from one environment to another—from a blue-walled cage to a black-walled experiment room, for example—place-cell firing is unrelated between the environments. Researchers looked at where cells were active in one environment and compared it to where they were active in the other. If a cell fired in the corner of the blue cage as well as the black room, there might be some cross-contamination between environments. The researchers didn’t see any such correlation in the place-cell activity. It appears that the hippocampus is able to represent two environments without confounding the two. This property of place cells could be useful for constructing cognitive spaces, where avoiding cross-contamination would be essential. “By connecting all these previous discoveries,” Bellmund said, “we came to the assumption that the brain stores a mental map, regardless of whether we are thinking about a real space or the space between dimensions of our thoughts.”

Reckoning With Violence

The crime debate is another example of how the ruling elite is disconnected from the American majority. Most Americans support rehabilitation, rather than punishment. This support is even stronger among victims of crimes because they understand how tough-on-crime policies have destroyed their communities and harmed the people they care about.

But the ruling elite make massive profits from the privatized prisons and, if nothing else, it is highly effective social control in keeping the population in a permanent state of anxiety and fear. The purpose was never to make the world a better place or to help the average American, much less those struggling near the bottom.

The system works perfectly for its intended purpose. The problem is its intended purpose is psychopathic and evil. And I might add, the ruling elite promoting it is bipartisan. It’s time that we the American people demand justice for our families and communities and refuse anything less from those who attempt to get in our way. Let’s save our righteous wrath for those most deserving of it.

* * *

Reckoning With Violence
by Michelle Alexander

As Ms. [Danielle] Sered explains in her book [Until We Reckon], drawing on her experience working with hundreds of survivors and perpetrators of violence in Brooklyn and the Bronx, imprisonment isn’t just an inadequate tool; it’s often enormously counterproductive — leaving survivors and their communities worse off.

Survivors themselves know this. That’s why fully 90 percent of survivors in New York City, when given the chance to choose whether they want the person who harmed them incarcerated or in a restorative justice process — one that offers support to survivors while empowering them to help decide how perpetrators of violence can repair the damage they’ve done — choose the latter and opt to use the services of Ms. Sered’s nonprofit organization, Common Justice. […]

Ninety percent is a stunning figure considering everything we’ve been led to believe that survivors actually want. For years, we’ve been told that victims of violence want nothing more than for the people who hurt them to be locked up and treated harshly. It is true that some survivors do want revenge or retribution, especially in the immediate aftermath of the crime. Ms. Sered is emphatic that rage is not pathological and a desire for revenge is not blameworthy; both are normal and can be important to the healing process, much as denial and anger are normal stages of grief.

But she also stresses that the number of people who are interested only in revenge or punishment is greatly exaggerated. After all, survivors are almost never offered real choices. Usually when we ask victims “Do you want incarceration?” what we’re really asking is “Do you want something or nothing?” And when any of us are hurt, and when our families and communities are hurting, we want something rather than nothing. In many oppressed communities, drug treatment, good schools, economic investment, job training, trauma and grief support are not available options. Restorative justice is not an option. The only thing on offer is prisons, prosecutors and police.

But what happens, Ms. Sered wondered, if instead of asking, “Do you want something or nothing?” we started asking “Do you want this intervention or that prison?” It turns out, when given a real choice, very few survivors choose prison as their preferred response.

This is not because survivors, as a group, are especially merciful. To the contrary, they’re pragmatic. They know the criminal justice system will almost certainly fail to deliver what they want and need most to overcome their pain and trauma. More than 95 percent of cases end in plea bargains negotiated by lawyers behind the scenes. Given the system’s design, survivors know the system cannot be trusted to validate their suffering, give them answers or even a meaningful opportunity to be heard. Nor can it be trusted to keep them or others safe.

In fact, many victims find that incarceration actually makes them feel less safe. They worry that others will be angry with them for reporting the crime and retaliate, or fear what will happen when the person eventually returns home. Many believe, for good reason, that incarceration will likely make the person worse, not better — a frightening prospect when they’re likely to encounter the person again when they’re back in the neighborhood. […]

A growing body of research strongly supports the anecdotal evidence that restorative justice programs increase the odds of safety, reduce recidivism and alleviate trauma. “Until We Reckon” cites studies showing that survivors report 80 to 90 percent rates of satisfaction with restorative processes, as compared to 30 percent for traditional court systems.

Common Justice’s success rate is high: Only 7 percent of responsible parties have been terminated from the program for a new crime. And it’s not alone in successfully applying restorative justice principles. Numerous organizations — such as Community Justice for Youth Institute and Project NIA in Chicago; the Insight Prison Project in San Quentin; the Community Conferencing Center in Baltimore; and Restorative Justice for Oakland Youth — are doing so in communities, schools, and criminal justice settings from coast to coast.

In 2016, the Alliance for Safety and Justice conducted the first national poll of crime survivors and the results are consistent with the emerging trend toward restorative justice. The majority said they “believe that time in prison makes people more likely to commit another crime rather than less likely.” Sixty-nine percent preferred holding people accountable through options beyond prison, such as mental health treatment, substance abuse treatment, rehabilitation, community supervision and public service. Survivors’ support for alternatives to incarceration was even higher than among the general public.

Survivors are right to question incarceration as a strategy for violence reduction. Violence is driven by shame, exposure to violence, isolation and an inability to meet one’s economic needs — all of which are core features of imprisonment. Perhaps most importantly, according to Ms. Sered, “Nearly everyone who has committed violence first survived it,” and studies indicate that experiencing violence is the greater predictor of committing it. Caging and isolating a person who’s already been damaged by violence is hardly a recipe for positive transformation.

The Court of Public Opinion: Part 1

This is about public opinion and public perception as it relates to public policy (see previous posts). I also include some analyses of the opinions of politicians as it relates to public opinion or rather their perception of what they think or want to believe about the public (for background, see here and here).

I’ll begin with a problematic example of a poll. Here is an article that someone offered as proving the public supports tough-on-crime policies:

There were stunning findings in a new poll released Monday on crime in New York City. Keeping crime down is way more important to voters than reforming the NYPD’s controversial stop-and-frisk program…

a new Quinnipiac University poll…reveals that public safety is uppermost on the minds of voters…

Asked which was more important, keeping crime rates down or reforming stop and frisk, 62 percent said keeping crime rates low and 30 percent said reforming stop and frisk.

The article itself isn’t important. There are thousands like it, but I wanted to use it for the polling data it was using.

I don’t know of any bias from Quinnipiac, beyond a basic mainstream bias, and so maybe the wording of the question was simply intellectual laziness. It was phrased as a forced choice question that implied choosing one negated the possibility of the other and it implied those were the only choices for public policy.

I looked further into data related to stop and frisk. It isn’t as simple as the forced choice presents it. For one, a number of studies don’t show that stop and frisk actually keeps crime rates low, as the question assumes. Secondly, when given more information and more options, Americans tend to support funding programs that either help prevent crime or help rehabilitate criminals.

The general public will favor punishment, when no other good choices are offered them. Still, that doesn’t say much about the fundamental values of most Americans. I’m not just interested in the answers given, but also the questions asked, how they are framed and how they are worded.

The Court of Public Opinion: Part 2

I’ll highlight one issue. It is a chicken or the egg scenario.

The political elites are fairly clueless about the views of the general public, including their own constituents. At the same time, the average American is clueless about what those in government are actually doing. This disconnection is what one expects from a society built on a class-based hierarchy with growing extremes of inequality. In countries that have lower inequality, there is far less disconnection between political elites and the citizenry.

It isn’t clear who is leading whom. How could politicians simply be doing what the public wants when they don’t know what the public wants? So, what impact does public opinion even have? There is strong evidence that public opinion might simply be following elite opinion and reacting to the rhetoric heard on the MSM.

Populations are easily manipulated by propaganda, as history shows. That seems to be the case with the United States as well.

As such, it isn’t clear how punitive most Americans actually are. When given more and better information, when given more and better options, most Americans tend to focus away from straightforward punitive policies. Imagine what the public might support if we ever had an open and honest debate based on the facts.

Specific Language Impairment

My parents recently heard J. Bruce Tomblin give a talk. He is a professor emeritus at the University of Iowa, in the Department of Communication Sciences and Disorders. He was discussing specific language impairment.

I’ve previously suspected that my own language difficulties might be related to autism spectrum disorders [ASD], but specific language impairment [SLI] might be a better fit. As for symptoms, there is some overlap between ASD and SLI. A big difference is that SLI has no specific direct causes to be found in IQ differences, brain damage, etc. SLI kids otherwise appear normal, although as with ASD there can be difficulties with socializing. Language is kind of important for developing and maintaining relationships.

This is the first time I’ve heard of SLI and, despite being far more common (5-8% of kids) than ASD or dyslexia (less than 1% in both cases), it is less well known. An interesting explanation for this is that parents of SLI children also tend to have language difficulties, which impacts interpersonal skills, and so there are fewer people to effectively advocate for SLI. The sad consequence is relatively little money has gone into funding SLI research and so SLI kids rarely get the help they need. It’s a non-flashy condition and so easily ignored. No one makes movies, tv shows, and documentaries about those struggling with SLI.

By the way, the UI Department of Communication Sciences and Disorders had its origins in stuttering. One of the founders was Dean Williams. I had some stuttering as a child, as did my brother, although for different reasons: mine was related to language and my brother’s to anxiety. I bring this up because my brother worked directly with Dean Williams when he was young, as my mother knew him through her own career as a speech pathologist. Stuttering is sometimes seen in SLI and so it is fitting that Tomblin is in that department. Here is something I wrote about this in connection to Standard American English:

“It’s interesting to note that many of the earliest speech centers and speech corrections/therapy schools in the US were in the Midwest, where many of the pioneers (e.g., Charles Van Riper) in the field came from—such as Michigan, Wisconsin, and Iowa. Right here in the town I live in, Iowa City, was one of the most influential programs and one of the main professors in that program was born in Iowa City, Dean Williams. As my mother audited one of Williams’ classes, she got to know him and he worked with my brother’s stuttering. Interestingly, Williams himself came in contact with the field because of his own childhood stuttering, when Wendell Johnson helped him. My mother heard Williams say that, while he was in the military during WWII, Johnson sent him speech journals as reading material which inspired him to enter the field when he returned after the war.”

I don’t know that Tomblin would have personally known or professionally worked with Williams. But surely Williams’ legacy would have been felt.

* * *

Specific language impairment
from Wikipedia

Specific language impairment (SLI) is diagnosed when a child’s language does not develop normally and the difficulties cannot be accounted for by generally slow development, physical abnormality of the speech apparatus, autism spectrum disorder, apraxia, acquired brain damage or hearing loss. […]

Specific language impairment (SLI) is diagnosed when a child has delayed or disordered language development for no apparent reason.[3] Usually the first indication of SLI is that the child is later than usual in starting to speak and subsequently is delayed in putting words together to form sentences. Spoken language may be immature. In many children with SLI, understanding of language, or receptive language, is also impaired, though this may not be obvious unless the child is given a formal assessment.[4]

Although difficulties with use and understanding of complex sentences are a common feature of SLI, the diagnostic criteria encompass a wide range of problems, and for some children other aspects of language are problematic (see below). In general, the term SLI is reserved for children whose language difficulties persist into school age, and so it would not be applied to toddlers who are late to start talking, most of whom catch up with their peer group after a late start.[5]

ADHD, Auditory Processing Disorder or Specific Language Impairment?
by Devon Barnes

SLI – Specific Language Impairment

Children with SLI are delayed and disordered in the development of their understanding (Receptive Language) and use of language (Expressive Language).

SLI is not due to cognitive deficits, sensory impairments or neurological problems.

Students with SLI can present with:

  • A history of delayed language development
  • Poor comprehension
  • Limited use of vocabulary
  • Poor grammar
  • Difficulty with sentence formulation
  • Word-finding difficulties
  • Poor pragmatic skills.

Children with language impairments can have excellent decoding skills but struggle with reading comprehension, and so these difficulties can be missed in the early school years because they appear to be “good readers”.

In later years language weaknesses can negatively impact the student’s ability to cope with the learning demands of the curriculum as language underpins all learning.

They will have particular difficulty understanding grade related texts and with written language tasks such as assignments and essays.

Top 10 Things you should know . . .
about children with Specific Language Impairment

from The Merrill Advanced Studies Center, The University of Kansas

  1. Specific Language Impairment has many names and it is surprisingly common.
    SLI is just one of the many communication disorders that affect more than 1 million students in the public schools. If your child has been evaluated by a speech pathologist, you may have heard its other names: developmental language disorder, language delay or developmental dysphasia. Specific language impairment is the precise name that opens the door to research about how to help a child grow and learn.
    SLI is more common than you might think. Research over the past ten years has generated accurate estimates of the numbers of young children that are affected by SLI. We now know it could be as high as 7 to 8 percent of the children in kindergarten. In comparison, Down syndrome or autism affects less than one percent of five-year-olds.
  2. Late talking may be a sign of disability.
    As they enter their twos and grow into threes and fours, children have a remarkable number of ways to tell adults what they need. Even if the words don’t all sound right, a normally developing child will make many efforts to communicate and will make his point effectively. Young children ask so many questions — often exhausting their parents and care providers. Children who don’t ask questions or tell adults what they want may have a communication disorder.
    Children with SLI may not produce any words until they are nearly two years old. At age three, they may talk, but can’t be understood. As they grow, they will struggle to learn new words, make conversation and sound coherent. Today, research is underway to determine which children do not outgrow this pattern of delayed speech. By age 4 to 5 years, SLI could be a signpost of a lasting disability that persists throughout the school years.​
  3. A child with SLI does not have a low IQ or poor hearing.
    Several other disabilities involve difficulties communicating, but for these children the primary diagnosis will be mental retardation, or autism, or hearing loss, or cerebral palsy. A child with SLI scores within the normal range for nonverbal intelligence. Hearing loss is not present. Emerging motor skills, social-emotional development and the child’s neurological profile are all normal. The only setback is with language. SLI is the primary diagnosis.
  4. Speech impediments are different from language disorders.
    A child with a speech disorder makes errors in pronouncing words, or may stutter. Recent studies find that most children with SLI do not have a speech disorder. SLI is a language disorder. This means that the child has difficulty understanding and using words in sentences. Both receptive and expressive skills are typically affected.
  5. An incomplete understanding of verbs is an indicator of SLI.
    Five-year-old children with SLI sound about two years younger than they are. Listen to the way a child uses verbs. Typical errors include dropping the -s off present tense verbs and asking questions without the usual “be” or “do” verbs. For example, instead of saying “She rides the horse” the child will say “She ride the horse.” Instead of saying “Does he like me?” the child will ask “He like me?” Children with SLI also have trouble communicating that an action is complete because they drop the past tense ending from verbs. They say, “She walk to my house yesterday” instead of “She walked to my house.”
  6. Reading and learning will be affected by SLI.
    SLI does affect a child’s academic success, especially if left untreated. Forty to seventy-five percent of the children have problems learning to read.

How Children With Specific Language Impairment View Social Situations: An Eye Tracking Study
by Mariko Hosozawa, Kyoko Tanaka, Toshiaki Shimizu, Tamami Nakano, and Shigeru Kitazawa

RESULTS: The SLI [Specific Language Impairment] and TD [Typically Developing] groups each formed a cluster near the center of the multidimensional scaling plane, whereas the ASD [Autism Spectrum Disorders] group was distributed around the periphery. Frame-by-frame analyses showed that children with SLI and TD children viewed faces in a manner consistent with the story line, but children with ASD devoted less attention to faces and social interactions. During speech scenes, children with SLI were significantly more fixated on the mouth, whereas TD children viewed the eyes and the mouth.

CONCLUSIONS: Children with SLI viewed social situations in ways similar to those of TD children but different from those of children with ASD. However, children with SLI concentrated on the speaker’s mouth, possibly to compensate for audiovisual processing deficits. Because eyes carry important information, this difference may influence the social development of children with SLI.

Autism’s hidden older brother – Specific Language Impairment
by Andrew Whitehouse

Still, SLI remains very much a “hidden disability” within the community – poorly understood and rarely discussed.

One reason for the lack of awareness about SLI is the very nature of the condition. Communication difficulties silence the biggest weapon for penetrating the public consciousness – advocacy. Add to this the fact that relatives of children with SLI often have communication difficulties themselves and the advocacy problem is exacerbated.

A second reason is SLI’s younger but more muscular brothers: developmental dyslexia and autism. The earliest descriptions of SLI date to the early 19th century, well before the first descriptions of developmental dyslexia (1887) and autism (1943). But, the intrigue of the symptoms associated with these latter conditions, in addition to the strong and numerous advocacy groups that support them, have facilitated increased research and greater public awareness.

A recent analysis of data from the National Institute of Health revealed that autism receives over 30 times more research funding than SLI, despite affecting five times fewer people.

Promising research areas

Perhaps the most promising area of research for SLI investigates the “specialization” of the brain’s two hemispheres. In the majority of people, the most crucial areas involved in language production are found in the left hemisphere. Brain imaging studies have found that children and adults with SLI are more likely to have these language areas in the right hemisphere.

But like the genetic investigation of SLI, considerably more research is needed to really understand the neurological differences that underpin this condition.

Another area of particular interest is short-term memory. A series of experiments in the early 1990s found that children with SLI have considerable difficulty in accurately repeating nonsense words, such as perplisteronk and scriflunaflisstrop.

The inability to memorise previously unheard “words” and repeat them accurately is one possible reason why children with SLI have difficulty growing their vocabulary and stringing words together in complex grammatical structures.

Long-term impacts

Language development is highly variable and many children have early difficulties that resolve by the time they enter school. But when language difficulties persist into the school years – as in the case of children with SLI – there are often considerable longer-term effects.

Children with SLI are less likely to complete secondary school, and are more likely to experience long periods of unemployment during adulthood. What’s more, individuals with SLI have greater difficulties forming close friendships and romantic relationships.

The impact on mental health is significant, and adults with SLI are at a disturbingly high risk (around 50%) for depressive and anxiety disorders.

Low-Carb Diets On The Rise

I’ve been paying close attention to diet this past year. It’s something I’ve focused on for decades, but new information has recently reshaped the public debate. For example, a few years back, the research data from Ancel Keys was reanalyzed and an entirely different conclusion was found to be more plausible — instead of blaming saturated fat, the stronger correlation was to sugar. So much of what mainstream dietitians and nutritionists asserted as fact was based on Keys’ work, but it has since come under a dark cloud of doubt. Simply put, it was horrible science and even worse public health policy.

My own recent interest, though, was piqued by watching the documentary The Magic Pill. It came out in 2017, several other great documentaries have come out in the last few years, and Nina Teicholz has one in the works. In playing around with diet in the broad sense, I didn’t find much that helped, beyond limiting added sugar and throwing in a few healthy traditional foods (e.g., cultured dairy). I was never much interested in formal diets: some combination of laziness, apathy, and being too independent-minded meant figuring things out for myself or else failing on my own terms; no doubt plenty of failure was involved, along with long periods of depressive despair and frustration. I’ve always been more about experimenting and finding what works or doesn’t work for me, if for no other reason than being stubborn in going my own way.

The problem was that nothing had fundamentally worked for the depression that plagued me my whole life, nor for the weight gain that hit me as I approached my 40s. It is damn hard struggling to be healthy while depressed, but I did try such things as exercising regularly, for it had some immediate palpable effect. Still, it was strange to exercise and yet not lose weight, even if aerobics did lift my mood ever so slightly. I was literally running to stay in place.

That is where The Magic Pill came in. I randomly came across it and watched it out of passing curiosity. Something about the case it made was compelling to me, a blend of science and personal experience that rang true to my decades of reading and experimentation. It brought many pieces together: the whole foods emphasis on quality, the vegetarian emphasis on plant foods, the traditional food emphasis on nutrient-density, the low-carb emphasis on avoiding grains, legumes, and sugar, the ketogenic emphasis on shifting metabolism, mood, and much else, the alternative health emphasis on eliminating processed foods and additives, and the holistic/functional medicine emphasis on seeing the body as a system and part of larger systems.

So, what miraculous diet brings all of this diversity of views together under the umbrella of a coherent understanding? It’s the paleo diet, although some prefer to call it a lifestyle or a philosophy as it isn’t a singular dietary regimen or protocol. It’s about learning how to be healthy by following the examples of traditional societies in combination with the best science available, not only research in diet and nutrition as narrow fields but also research from dentistry, anthropology, archaeology, etc — any and all info that helps us understand the evolution of human health, specifically in explaining what has gone so terribly wrong in industrialized societies with the diseases of civilization. Diet is important, but only one part. Through an alliance with functional medicine, there is a greater focus on what makes for a healthy lifestyle: exercise, stress reduction, toxicity elimination, forest bathing, sun exposure, learning new things, etc… and don’t forget about play, something lost to so many modern adults.

Despite that greater focus of concern, it is the dietary angle that draws people in. Simply put, a lot of people feel better on the paleo diet, often healing numerous conditions or at least reversing some of the worst symptoms, from obesity and diabetes to autism and depression to Alzheimer’s and multiple sclerosis, and much else. The paleo diet, as with traditional foods (both inspired by the work of Weston A. Price), is a good introduction to an alternative way of thinking not only about diet but about health in general. It seems to be a gateway diet for many who go on to try related diets: primal (paleo plus dairy), Whole30, ketogenic, ketotarian, pegan, pescatarian, carnivore, etc. Primal, as one common example, demonstrates how paleo dieters tend to drift toward the similar traditional foods approach. Paleo is more of a framework than anything else, to the extent that it requires or promotes a paradigm change in one’s attitude.

The greater issue at hand is a potential paradigm change in society. That is the battle going on right now, between those promoting the shift and those defending the status quo. Most figures and institutions of authority attack diets like paleo and keto because they are threatening. And the reason they are threatening is their growing popularity, which in turn comes from their being highly effective for their intended purposes, while also being followed and sometimes promoted by many famous people, from media figures to politicians, including plenty of athletes (according to various sources, and in no particular order):

Bill Clinton, Madonna, Drew Carey, Renee Zellweger, Katie Couric, Al Roker, Halle Berry, Kim Kardashian, Kourtney Kardashian, Vinny Guadagnino, Jordan Peterson, Vanessa Hudgens, Megan Fox, Adriana Lima, Jessica Biel, Blake Lively, Channing Tatum, Eva La Rue, Phil Mickelson, Aisha Tyler, Matthew McConaughey, Edgar Ramirez, Jeb Bush, Kanye West, Christina Aguilera, Jack Osbourne, Kelly Osbourne, Sharon Osbourne, Miley Cyrus, Ursula Grobler, Becca Borawski, Aaron Rodgers, Andrew Flintoff, Jenna Jameson, Savannah Guthrie, Chris Scott, Tamra Judge, Grant Hill, Uma Thurman, Kobe Bryant, Gwyneth Paltrow, LeBron James, Alicia Vikander, Tim McGraw, Kristin Cavallari, Tom Jones, Mick Jagger, Melissa McCarthy, Jennifer Lopez, Robin Wright, Cindy Crawford, Jennifer Aniston, Guy Sebastian, Elle Macpherson, Courteney Cox, Catherine Zeta Jones, Geri Halliwell, Ben Affleck, Joe Rogan, Brendan Schaub, Shane Watson, Tim Ferriss, Jessica Simpson, Rosie O’Donnell, Lindsey Vonn, Alyssa Milano, Kendra Wilkinson, Britney Spears, Joe Manganiello, Tom Kerridge, Jessica Alba, Mariah Carey, Tobey Maguire, Jennifer Hudson, Shania Twain, etc.

These low-carb diets work. People feel better, lose weight, go off their meds, have a lot of energy, and on and on. It’s a paradigm change with a real kick, and so the change is largely coming from below, from probably hundreds of thousands of individuals experimenting much as I’ve done, including individual doctors who decide to buck the system and sometimes are punished for it (a few key examples: John Yudkin, Tim Noakes, and Gary Fettke). And every individual this works for ends up being an inspiration to numerous others, even if only to the people they personally know, such as family members, friends, neighbors, and coworkers. Other people see that it works and so they try it themselves. This is how it went from a minor diet to its present growing momentum, and did so in a fairly short period of time.

As I was saying at the beginning of this piece, I’ve been observing this shift. And I’ve come to realize it might be a seismic change going on. Every now and then, I see hints of the impact in the world around me. These alternative views are taking hold and won’t remain alternative for long. They are forcing their way into mainstream awareness. Unsurprisingly, there is backlash.

There is the corporate media, of course, with their typical attack pieces on “fad diets”, ignoring the fact that the keto diet has been medically researched since the early 1900s, the low-carb diet having been the first popular diet starting back in the 1800s, the traditional foods diet based on thousands of years of shared human experience, and the paleo diet as the diet hominids have thrived on for millions of years. The corporate media prefers to ignore what is threatening, until the point it no longer can be ignored, and so we are in that second phase right now, maybe a bit beyond since the mainstream authorities have already adopted some of the alternative views without acknowledging it (e.g., AHA quietly lowering its recommendations of carb intake after pushing a high-carb diet for a half century, as if hoping no one would notice this implicit admission to having been wrong, and wrong in a way that harmed so many). Local media is sometimes more open to new views, though.

The whole EAT-Lancet issue demonstrates the sense of conflict in the air. The authors of the report frame the situation as a crisis for all of humanity and the earth. And they use that as a cudgel to bash the new low-carb challengers, to nip them in the bud, even to the extreme of pushing for international regulations that would force conformity with the high-carb approach of conventional diets that have risen to prominence these past decades — mainstream versions of: vegetarianism, veganism, and Mediterranean (the modern Mediterranean diet as studied after World War II, not the traditional one with high levels of animal foods that existed for millennia before 20th century industrialization of the food system, no noodles or tomatoes prior to modern colonial trade, and surprisingly not much if any olive oil since according to ancient texts it was mainly used for lamp fuel, with animal fat being preferred for cooking). We’ve seen this push with such things as “Veganuary”.

It has become an overtly ideological fight, but maybe it always was. The politicization of diet goes back to the early formalized food laws that became widespread in the Axial Age and regained centrality in the Middle Ages, which for Europeans meant a revival of ancient Greek thought, specifically that of Galen. And it is utterly fascinating that pre-scientific Galenic dietary philosophy has since taken on scientific garb and gets peddled to this day, as a main current in conventional dietary thought (see Food and Faith in Christian Culture ed. by Ken Albala and Trudy Eden with an excerpt to be read here; I made this connection in realizing that Stephen Le, a biological anthropologist, was without awareness parroting Galenic thought in his book 100 Million Years of Food).

But the top-down approach to pushing dietary regimens hasn’t been all that successful in more recent years, maybe because of growing cynicism about past failures. Even with it being heavily promoted by well-funded organizations and government agencies, the high-carb plant-based diets are beginning to find it hard to maintain their footing in the tides of change. According to various data, it’s easy to get people to try veganism for a short period, but few maintain it. Vegetarianism is less restrictive, of course, but consistent adherence is still rare. The vast majority who start veganism or vegetarianism either occasionally eat meat or fish or else eventually give up on the diet. There is big money, including corporate money, behind the campaigns promoting it (most processed foods, including junk food, are technically vegan and big food has come to realize this is an effective way of marketing unhealthy food as healthy). Still, it doesn’t seem to be catching on with the general public, not that I doubt there will be those who continue their games of propaganda, persuasion, and perception management.

People have gotten the message that a plant-based diet is good. That part of the official messaging machine has been successful. Indeed, for decades, most Americans have been increasing their intake of fruits and vegetables, and that is a good thing; but as far as that goes, the paleo diet and many related diets also tend to recommend high levels of fruits and vegetables. The main advantage the low-carb diets have is that it’s easier to give up bread than to give up all animal foods (including eggs and dairy), though vegetarianism is a decent compromise since it allows some animal foods and that increases the availability of the key fat-soluble vitamins. Low-carb, keto, or paleo vegetarianism isn’t hard to do, so it isn’t an either/or scenario; yet many pushing a so-called “plant-based” diet for some reason want to portray it in such dualistic terms, maybe as a way of falsely portraying low-carb as an anti-plant caricature in order to make it seem ridiculous and extremist.

Despite the ideological reaction, there is the growing realization that maybe there is some profit to be had in this emerging trend, as most businesses ultimately don’t care about dietary ideology and will go where the wind blows. New products cater to these alternative diets (paleo creamer, keto supplements, etc) or else old products are repackaged (“Keto Friendly!”). This is why it gets called a “fad diet”. But if being heavily marketed makes a diet a fad, then the same label applies to conventional diets as well, which are more heavily marketed than any alternative diet. I’ve also begun seeing paleo and keto magazines, guides, and recipe booklets in grocery stores. Even when dismissed by experts, such as in rankings of recommended diets, these “fad diets” nonetheless get mentioned, albeit usually tossed to the bottom of the list. As all this demonstrates, we are long past the silent treatment.

Furthermore, it goes beyond the products specifically marketed as paleo or keto or whatever. Demand has been increasing for organ meats, coconut products (from coconut milk to coconut oil), cauliflower, etc; consumption of eggs is likewise on the rise — all favorites on the paleo diet, in particular, but also favorites for similar diets. Prices have been going up on these items and, because demand sometimes exceeds supply, they can go out of stock at stores. Why are they so sought after? Organ meats are nutrient-dense, coconut milk is a good replacement for dairy and coconut oil for unhealthy vegetable/seed oils, and cauliflower can be used as a replacement for rice, mashed potatoes, tater tots and pizza crust (“The weird thing about cauliflower, though, is that while it has allies, it doesn’t really have adversaries.” ~Rachel Sugar); as for eggs, their popularity needs no explanation now that the cholesterol and saturated fat myths are evaporating.

Even Oprah Winfrey, though financially invested in the conventional Weight Watchers diet (she owns 8% of the company) and a self-declared lover of bread (actual quote: “I love bread!”), has put out a line of products that includes a low-carb pizza with cauliflower crust. This is interesting since, as low-carb diets have gained popularity, the stock of Weight Watchers has plunged 60%, costing Oprah at least 58 million dollars in one night and some 500 million overall, putting her star power to a serious test — maybe Oprah decided it is wise not to put all her eggs in one basket, in case Weight Watchers totally tanks. The company is finding it difficult to gain and retain subscribers. Those profiting from established dietary ideology are feeling the pinch.

It’s amusing how Weight Watchers CEO Mindy Grossman responded to the low-carb threat: “We have a keto surge,” she said. “It’s a meme, it’s not like a company, it’s people have keto donuts, and everybody on the diet side look for the quick fix. We’ve been through this before, and we know that we are the program that works.” And that, “We’ve lived through this [competition from fad diets] for 57 years and we’re not going to play a game and we never have.” Good luck with that! Maybe in reassuring stockholders, she also stated that, “We’re going to be science informed and we’re sustainable for the long term.” That is great. Everyone should be science informed. The problem for those trying to hold onto old views is that the science has changed and so has the public’s knowledge of that science.

Most people these days aren’t looking for complicated diets with eating plans and paid services, much less pre-prepared meals to be bought. A subscription model is becoming less appealing, as so much info and other resources are now available online. Besides, the DIY (Do It Yourself) approach is preferred these days. Diets like paleo and keto are simple and straightforward, and they can be easily modified for individual needs or affordability. But even for those looking for a ready-made system like Weight Watchers, there are other options out there that are looking attractive: “Wall Street is clearly nervous, too. JPMorgan analyst Christina Brathwaite downgraded the [Weight Watchers] stock to “underperform” last week and slashed her price target. One of the reasons? She was worried about competition from rival weight-loss service Diet Doctor, which is a proponent of keto.”

In whatever form, like it or not, low-carb diets are on the rise. Even among vegans and vegetarians, the low-carb approach will probably become more common. Maybe that is why we’ve suddenly seen new low-carb, plant-based diets like Dena Harris’ paleo vegetarianism (2015), Will Cole’s ketotarianism (2018), and Mark Hyman’s peganism (2018). Do a web search about any of this and you’ll find numerous vegans and vegetarians asking about, discussing, or else praising low-carb diets. The same broad interest shows up across thousands of websites, blogs, and articles. Hundreds upon hundreds of organizations, discussion forums, Reddit groups, Facebook groups, Twitter alliances, etc have sprouted up like mushrooms. More and more are jumping on the low-carb bandwagon, as apparently that is what a large and growing part of the public is demanding. Whether or not it ever was a fad, it is now a movement and it isn’t slowing down.

Fasting, Calorie Restriction, and Ketosis

What we eat obviously affects gut health, including the microbiome, and through that, along with other mechanisms, it affects the rest of the body, the brain included (by way of permeability, the immune system, the vagus nerve, substances like glutamate and propionate, and much else). About general health, I might add that which foods are eaten in combination (e.g., red meat and grains) is also an issue. And just as eating impacts neurocognition and mental health, so does not eating: fasting (whether intermittent or extended), caloric restriction, and carbohydrate reduction, ketogenic or otherwise, each alter it in their own ways.

Fasting, for example, increases the level of neurotransmitters such as serotonin, dopamine, and norepinephrine while temporarily reducing the brain’s release and use of them; plus, serotonin and its precursor tryptophan are made more available to the brain. So fasting allows your reserves of neurotransmitters to rebuild to higher levels. That is partly why a ketogenic diet, along with the brain’s efficient use of ketones, shows improvements in behavior, learning, memory, acuity, focus, vigilance, and mood (such as sense of well-being and sometimes euphoria), with specific benefits, to take a couple of examples, in cerebral blood flow and prefrontal-cortex-related cognitive functions (mental flexibility and set shifting). It also promotes stress resistance, inflammation reduction, weight loss, and metabolism, while decreasing free radical damage, blood pressure, heart rate, and glucose levels. Many of these benefits are similar to those seen with strenuous exercise.

We know so much about this because the ketogenic diet is the only diet that has been specifically and primarily studied in terms of neurological diseases. The research goes back to early 20th century work on epileptic seizures and autism; the diet was shown effective for other conditions later in the century (e.g., V. A. Angelillo et al, Effects of low and high carbohydrate feedings in ambulatory patients with chronic obstructive pulmonary disease and chronic hypercapnia); and more recently positive results have been seen in numerous other conditions (Dr. Terry Wahls’ work on multiple sclerosis, Dr. Dale Bredesen’s work on Alzheimer’s, etc). By the way, the direction of causality can also go the other way around, from brain to gut: “Studies also suggest that overwhelming systemic stress and inflammation—such as that induced via severe burn injury—can also produce characteristic acute changes in the gut microbiota within just one day of the sustained insult [15].” (Rasnik K. Singh et al, Influence of diet on the gut microbiome and implications for human health). And see:

“Various afferent or efferent pathways are involved in the MGB axis. Antibiotics, environmental and infectious agents, intestinal neurotransmitters/neuromodulators, sensory vagal fibers, cytokines, essential metabolites, all convey information about the intestinal state to the CNS. Conversely, the HPA axis, the CNS regulatory areas of satiety and neuropeptides released from sensory nerve fibers affect the gut microbiota composition directly or through nutrient availability. Such interactions appear to influence the pathogenesis of a number of disorders in which inflammation is implicated such as mood disorder, autism-spectrum disorders (ASDs), attention-deficit hypersensitivity disorder (ADHD), multiple sclerosis (MS) and obesity.” (Anastasia I. Petra et al, Gut-Microbiota-Brain Axis and Its Effect on Neuropsychiatric Disorders With Suspected Immune Dysregulation)

There are many other positive effects. Fasting reduces the risk of neurocognitive diseases: Parkinson’s, Alzheimer’s, etc. And it increases the protein BDNF (brain-derived neurotrophic factor), which helps grow neuronal connections. Results include increased growth of nerve cells from stem cells (as stem cells are brought out of their dormant state) and an increased number of mitochondria in cells (mitochondria are the energy factories), the former related to the ability of neurons to develop and maintain connections between each other. An extended fast will result in autophagy (cellular housekeeping): the complete replacement of your immune cells and the clearing out of damaged cells, which improves the functioning of your entire body (autophagy used to be thought not to occur in the brain, but we now know it does). All interventions known to prolong youthful health, lessen and delay diseases of aging (diabetes, cancer, cardiovascular disease, etc), and extend lifespan in lab animals involve autophagy (James H. Catterson et al, Short-Term, Intermittent Fasting Induces Long-Lasting Gut Health and TOR-Independent Lifespan Extension). Even calorie restriction has no effect when autophagy is blocked (Fight Aging!, Autophagy Required For Calorie Restriction Benefits?). Fasting cleans out the system, gives the body a rest from its normal functioning, and redirects energy toward healing and rebuilding.

As a non-human example, consider hibernation for bears. A study was done comparing bears with a natural diet (fruits, nuts, insects, and small mammals) and those that ate human garbage (i.e., high-carb processed foods). “A research team tracked 30 black bears near Durango, Colo., between 2011 and 2015, paying close attention to their eating and hibernation habits. The researchers found that bears who foraged on human food hibernated less during the winters — sometimes, by as much as 50 days — than bears who ate a natural diet. The researchers aren’t sure why human food is causing bears to spend less time in their dens. But they say shorter hibernation periods are accelerating bears’ rates of cellular aging” (Megan Schmidt, Human Food Might Be Making Bears Age Faster). As with humans who don’t follow fasting or a ketogenic diet, bears who hibernate less don’t live as long. Maybe a high-carb diet messes with hibernation similarly to how it messes with ketosis.

Even intermittent fasting shows many of these benefits. Of course, you can produce dramatic changes in the body without fasting at all, if you’re on a ketogenic diet (though one could call it a carb fast, since it is extremely low carb) or severe caloric restriction (by the way, caloric restriction has been an area of mixed results and hence confusion — see two pieces by Peter Attia: Calorie restriction: Part I – an introduction & Part IIA – monkey studies; does intermittent fasting and ketosis mimic caloric restriction or the other way around?). I’d add a caveat: on any form of dietary limitation or strict regimen, results vary depending on the specifics of the test subjects and other factors: how restricted and for how long, micronutrient and macronutrient content of the diet, fat-adaptation and metabolic flexibility, etc. Humans, by the way, are designed for food variety, and so it is hard to know the consequences of a modern diet that often remains unchanged, season to season, year to year (Rachel Feltman, The Gut’s Microbiome Changes Rapidly with Diet). There is a vast difference between someone on a high-carb diet doing an occasional fast and someone on a ketogenic diet doing regular intermittent fasting. Even within a single factor such as a high-carb diet, there is little similarity between the average American eating processed foods and a vegetarian monk eating restricted calories. As another example, autophagy can take several days of fasting to be fully achieved; how quickly this happens depends on starting conditions such as how many carbs were eaten beforehand and how much glucose is in the blood and glycogen in the muscles, both of which need to be used up before ketosis begins.

Metabolic flexibility, closely related to fat-adaptation, requires flexibility of the microbiome. Research has found that certain hunter-gatherers have microbiomes that completely switch from season to season and so the gut somehow manages to maintain some kind of memory of previous states of microbial balance which allows them to be re-established as needed. This is seen more dramatically with the Inuit who eat an extremely low-carb diet, but they seasonally eat relatively larger amounts of plant matter such as seaweed and they temporarily have digestive issues until the needed microbes take hold again. Are these microbes dormant in the system or systematically reintroduced? In either case, the process is unknown, as far as I know. What we are clear about is how dramatically diet affects the microbiome, whatever the precise mechanisms.

For example, a ketogenic diet modulates the levels of the microbes Akkermansia muciniphila, Lactobacillus, and Desulfovibrio (Lucille M. Yanckello, Diet Alters Gut Microbiome and Improves Brain Functions). It is the microbes that mediate the influence on both epileptic seizures and autism: Akkermansia is decreased in the former and increased in the latter, which is to say the ketogenic diet helps the gut regain balance no matter the direction of the imbalance. In the case of epileptic seizures, Akkermansia spurs the growth of Parabacteroides, which alters neurotransmission by elevating the GABA/glutamate ratio (there is glutamate again): “the hippocampus of the microbe-protected mice had increased levels of the neurotransmitter GABA, which silences neurons, relative to glutamate, which activates them” (Carolyn Beans, Mouse microbiome findings offer insights into why a high-fat, low-carb diet helps epileptic children). No such effect was found in germ-free mice, that is, mice with no microbiome (similar results were found in human studies: Y. Zhang, Altered gut microbiome composition in children with refractory epilepsy after ketogenic diet). Besides reducing seizures, “GABA is a neurotransmitter that calms the body. Higher GABA to glutamate ratios has been shown to alleviate depression, reduce anxiety levels, lessen insomnia, reduce the severity of PMS symptoms, increase growth hormone, improve focus, and reduce systemic inflammation” (MTHFR Support, Can Eating A Ketogenic Diet Change Our Microbiome?). The other interesting mechanism involves Desulfovibrio. Ketosis reduces its numbers, and that is a good thing, since Desulfovibrio causes leakiness of the gut barrier, and what causes leakiness in one part of the body can cause it elsewhere as well, such as the blood-brain barrier. Autoimmune responses and inflammation can follow.
This is why ketosis has been found beneficial for preventing and treating neurodegenerative conditions like Alzheimer’s (plus, ketones are a useful alternative fuel in Alzheimer’s, since the brain cells of those patients begin starving to death as they lose the capacity to use glucose as fuel).

All of this involves the factors that increase and reduce inflammation: “KD also increased the relative abundance of putatively beneficial gut microbiota (Akkermansia muciniphila and Lactobacillus), and reduced that of putatively pro-inflammatory taxa (Desulfovibrio and Turicibacter).” (David Ma et al, Ketogenic diet enhances neurovascular function with altered gut microbiome in young healthy mice). Beyond the microbiome itself, this has immense impact on gut leakiness and autoimmune conditions, which allow inflammation to show up in numerous areas of the body, including of course the brain. Inflammation is found in conditions such as depression and schizophrenia. Even without knowing this mechanism, earlier research had long established that ketosis reduces inflammation.

It’s hard to know what this means, though. Hunter-gatherers tend to have much more diverse microbiomes than industrialized people. Yet the ketogenic diet that helps induce microbial balance simultaneously reduces diversity. So diversity isn’t always a good thing; another example is small intestinal bacterial overgrowth (SIBO). What matters is which microbes one has in abundance and, in relation, which microbes one lacks or has in limited numbers. And what determines that isn’t limited to diet in the simple sense of which foods we eat or don’t eat but involves the whole pattern of eating. Also keep in mind that, in a society like ours, most of the population is in some state of gut dysbiosis. Eliminating the harmful microbes comes first, before the body can heal and rebalance. That is indicated by a study on multiple sclerosis which found that, after the subjects had an initial reduction in the microbiome, “They started to recover at week 12 and exceeded significantly the baseline values after 23–24 weeks on the ketogenic diet” (Alexander Swidsinski et al, Reduced Mass and Diversity of the Colonic Microbiome in Patients with Multiple Sclerosis and Their Improvement with Ketogenic Diet). As always, it’s complex. But the body knows what to do when you give it the tools it’s evolutionarily adapted to.

In any case, all of the methods described can show a wide range of benefits and improvements in physical and mental health. They are potentially useful for almost anyone in a healthy state, and in some cases of disease, although as always you should seek medical advice before beginning any major dietary change, especially if you have an eating disorder or are malnourished (admittedly, almost all people on a modern industrialized diet are malnourished to some degree, especially Americans, though most not to a degree that is immediately life-threatening). Proceed with caution. But you are free to take your life in your hands by taking responsibility for your own health, experimenting to find out what happens (my preferred methodology). The best case scenario is that you gain benefit at no professional medical cost; the worst case scenario is that you die (not that I’ve heard of anyone dying from a typical diet involving fasting, ketosis, and such; you’re far more likely to die from the standard American diet; but individual health outcomes aren’t necessarily predictable from the experience of others, even the vast majority of others). Still, you’re going to die eventually, no matter what you do. I wish you well, until that time.

* * *

Let me clarify one point of widespread confusion. Talk of ‘diets’, especially of the variety I’ve discussed here, is often framed in terms of restriction, and that word does come up quite a bit. I’m guilty of talking this way even in this post, as it is all but impossible to avoid such language, considering it is used throughout the scientific and medical literature. The implication is one of deprivation, of self-control and self-denial, as if we must struggle and suffer to be healthy. That couldn’t be further from the truth.

Once you are fat-adapted and have metabolic flexibility, you are less restricted than you were before, in that you can eat more carbs and sugars for a time and then more easily return back to ketosis, as is a common seasonal pattern for hunter-gatherers. And once you no longer are driven by food cravings and addictions, you’ll have a happier and healthier relationship to food — eating when genuinely hungry and going without for periods without irritation or weakness, as also is common among hunter-gatherers.

This is simply a return to the state in which most humans have existed for most of our historical and evolutionary past. It’s not restriction or deprivation, much less malnourishment. It’s normalcy or should be. But we need to remember what normalcy looks and feels like: “People around the world suffer from starvation and malnutrition, and it is not only because they lack food and nutrients. Instead they suffer from immature microbiomes, which can severely impact health” (AMI, The effects of fasting and starvation on the microbiome). Gut health is inseparable from the rest, and these diets heal and rebalance the gut.

We need to redefine what health means, in a society where sickness has become the norm.

* * *

Genius Foods:
Become Smarter, Happier, and More Productive While Protecting Your Brain for Life
by Max Lugavere

Baby Fat Isn’t Just Cute—It’s a Battery

Have you seen a baby lately? I’m talking about a newborn, fresh out of the womb. They’re fat. And cute. But mostly fat. Packed with stored energy prior to birth in the third trimester, the fatness of human babies is unprecedented in the mammal world. While the newborns of most mammal species average 2 to 3 percent of birth weight as body fat, humans are born with a body fat percentage of nearly 15, surpassing the fatness of even newborn seals. Why is this so? Because humans are born half-baked.

When a healthy human baby emerges from the womb, she is born physically helpless and with an underdeveloped brain. Unlike most animals at birth, a newborn human is not equipped with a full catalogue of instincts preinstalled. It is estimated that if a human were to be born at a similar stage of cognitive development to a newborn chimp, gestation would be at least double the length (that doesn’t sound fun—am I right ladies?). By being born “prematurely,” human brains complete their development not in the womb, but in the real world, with open eyes and open ears—this is probably why we’re so social and smart! And it is during this period of rapid brain growth, what some refer to as the “fourth trimester,” that our fat serves as an important ketone reservoir for the brain, which can account for nearly 90 percent of the newborn’s metabolism. Now you know: baby fat isn’t just there for pinching. It’s there for the brain.

* * *

Mitochondria and the Future of Medicine:
The Key to Understanding Disease, Chronic Illness, Aging, and Life Itself
by Lee Know

Ketogenic Diets and Calorie Restriction

Ketone bodies, herein also referred to simply as ketones, are three water-soluble compounds that are produced as by-products when fatty acids are broken down for energy in the liver. These ketones can be used as a source of energy themselves, especially in the heart and brain, where they are a vital source of energy during periods of fasting.

The three endogenous ketones produced by the body are acetone, acetoacetic acid, and beta-hydroxybutyric acid (which is the only one that’s not technically a ketone, chemically speaking). They can be converted to acetyl-CoA, which then enters the TCA cycle to produce energy.

Because fatty acids are so energy-dense and the heart is one of the most energy-intensive organs, under normal physiologic conditions the heart preferentially uses fatty acids as its fuel source. Under ketotic conditions, however, the heart can effectively utilize ketone bodies for energy.

The brain is also extremely energy-intensive and usually relies on glucose for its energy. However, when glucose is in short supply, it gets a portion of its energy from ketone bodies (e.g., during fasting, strenuous exercise, a low-carbohydrate or ketogenic diet, and in neonates). While most other tissues have alternate fuel sources (besides ketone bodies) when blood glucose is low, the brain does not. For the brain, this is when ketones become essential. After three days of low blood glucose, the brain gets 25 percent of its energy from ketone bodies. After about four days, this jumps to 70 percent!

In normal healthy individuals, there is a constant production of ketone bodies by the liver and utilization by other tissues. Their excretion in urine is normally very low and undetectable by routine urine tests. However, as blood glucose falls, the synthesis of ketones increases, and when it exceeds the rate of utilization, their blood concentration increases, followed by increased excretion in urine. This state is commonly referred to as ketosis, and the sweet, fruity smell of acetone in the breath is a common feature of ketosis.

Historically, this sweet smell was linked to diabetes and ketones were first discovered in the urine of diabetic patients in the mid-nineteenth century. For almost fifty years thereafter, they were thought to be abnormal and undesirable by-products of incomplete fat oxidation.

In the early twentieth century, however, they were recognized as normal circulating metabolites produced by the liver and readily utilized by the body’s tissues. In the 1920s, a drastic “hyperketogenic” diet was found to be remarkably effective for treating drug-resistant epilepsy in children. In 1967, circulating ketones were discovered to replace glucose as the brain’s major fuel during prolonged fasting. Until then, the adult human brain was thought to be entirely dependent upon glucose.

During the 1990s, diet-induced hyperketonemia (commonly called nutritional ketosis) was found to be therapeutically effective for treating several rare genetic disorders involving impaired glucose utilization by nerve cells. Now, growing evidence suggests that mitochondrial dysfunction and reduced bioenergetic efficiency occur in brains of patients with Parkinson’s disease and Alzheimer’s disease. Since ketones are efficiently used by brain mitochondria for ATP generation and might also help protect vulnerable neurons from free-radical damage, ketogenic diets are being evaluated for their ability to benefit patients with Parkinson’s and Alzheimer’s diseases, and various other neurodegenerative disorders (with some cases reporting remarkable success).

There are various ways to induce ketosis, some easier than others. The best way is to use one of the various ketogenic diets (e.g., classic, modified Atkins, MCT or coconut oil, low-glycemic index diet), but calorie restriction is also proving its ability to achieve the same end results when carbohydrates are limited.

Features of Caloric Restriction

There are a number of important pieces to caloric restriction. First, and most obvious, is that caloric intake itself is the critical variable. Typically, calories are restricted by about 40 percent of what a person would consume if food intake were unrestricted. For mice and rats, calorie restriction to this degree results in very different physical characteristics (size and body composition) than those of their control-fed counterparts. Regarding life extension, even smaller levels of caloric restriction (a reduction of only 10–20 percent of unrestricted calorie intake) produce longer-lived animals and disease-prevention effects.

In April of 2014, a twenty-five-year longitudinal study on rhesus monkeys showed positive results. The benefit of this study was that it was a long-term study done in primates—humans’ closest relatives—and it confirms positive data previously seen in yeasts, insects, and rodents. The research team reported that monkeys in the control group (allowed to eat as much as they wanted) had a 2.9-fold increased risk of disease (e.g., diabetes) and a 3-fold increased risk of premature death, compared to calorie-restricted monkeys (which consumed a diet with 30 percent fewer calories).

If other data from studies on yeast, insects, and rodents can be confirmed in primates, it would indicate that calorie restriction could extend life span by up to 60 percent, making a human life span of 130–150 years a real possibility without fancy technology or supplements or medications. The clear inverse relationship between energy intake and longevity links its mechanism to mitochondria—energy metabolism and free-radical production.

Second, simply restricting the intake of fat, protein, or carbohydrates without overall calorie reduction does not increase the maximum life span of rodents. It’s the calories that count, not necessarily the type of calories (with the exception of those trying to reach ketosis, where type of calorie does count).

Third, calorie restriction has been shown to be effective in disease prevention and longevity in diverse species. Although most caloric restriction studies have been conducted on small mammals like rats or mice, caloric restriction also extends life span in single-celled protozoans, water fleas, fruit flies, spiders, and fish. It’s the only method of life extension that consistently achieves similar results across various species.

Fourth, these calorie-restricted animals stay “biologically younger” longer. Experimental mice and rats extended their youth and delayed (even prevented) most major diseases (e.g., cancers, cardiovascular diseases). About 90 percent of the age-related illnesses studied remained in a “younger” state for a longer period in calorie-restricted animals. Calorie restriction also greatly delayed cancers (including breast, colon, prostate, lymphoma), renal diseases, diabetes, hypertension, hyperlipidemia, lupus, and autoimmune hemolytic anemia, and a number of others.

Fifth, calorie restriction does not need to be started in early age to reap its benefits. Initiating it in middle-aged animals also slowed aging (this is good news for humans, because middle age is when most of us begin to think about our own health and longevity).

Of course, the benefits of calorie restriction relate back to mitochondria. Fewer calories mean less “fuel” (as electrons) entering the ETC, and a corresponding reduction in free radicals. As you know by now, that’s a good thing.

Health Benefits

As just discussed, new research is showing that judicious calorie restriction and ketogenic diets (while preserving optimal nutritional intake) might slow down the normal aging process and, in turn, boost cardiovascular, brain, and cellular health. But how? We can theorize that the restriction results in fewer free radicals, but one step in confirming a theory is finding its mechanism.

In particular, researchers have identified the beneficial role of beta-hydroxybutyric acid (the one ketone body that’s not actually a ketone). It is produced by a low-calorie diet and might be the key to the reduced risk of age-related diseases seen with calorie restriction. Over the years, studies have found that restricting calories slows aging and increases longevity, but the mechanism behind this remained elusive. New studies are showing that beta-hydroxybutyric acid can block a class of enzymes, called histone deacetylases , which would otherwise promote free-radical damage.

While additional studies need to be conducted, it is known that those following calorie-restricted or ketogenic diets have lower blood pressure, heart rate, and glucose levels than the general population. More recently, there has been a lot of excitement around intermittent fasting as an abbreviated method of achieving the same end results.

However, self-prescribing a calorie-restricted or ketogenic diet is not recommended unless you’ve done a lot of research on the topic and know what to do. If not done properly, these diets can potentially increase mental and physical stress on the body. Health status should be improving, not declining, as a result of these types of diets, and when not done properly, these diets could lead to malnutrition and starvation. Health care practitioners also need to properly differentiate a patient who is in a deficiency state of anorexia or bulimia versus someone in a healthy state of ketosis or caloric restriction.

I’ll add a final word of caution: While ketogenic diets can be indispensable tools in treating certain diseases, their use in the presence of mitochondrial disease—at this point—is controversial and depends on the individual’s specific mitochondrial disease. In some cases, a ketogenic diet can help; in others it can be deleterious. So, of all the therapies listed in this book, the one for which I recommend specific expertise in its application is this diet, and only after a proper diagnosis.

* * *

Grain Brain:
The Surprising Truth about Wheat, Carbs, and Sugar–Your Brain’s Silent Killers

by David Perlmutter

Caloric Restriction

Another epigenetic factor that turns on the gene for BDNF production is calorie restriction. Extensive studies have clearly demonstrated that when animals are on a reduced-calorie diet (typically reduced by around 30 percent), their brain production of BDNF shoots up and they show dramatic improvements in memory and other cognitive functions. But it’s one thing to read experimental research studies involving rats in a controlled environment and quite another to make recommendations to people based upon animal research. Fortunately, we finally have ample human studies demonstrating the powerful effect of reducing caloric intake on brain function, and many of these studies have been published in our most well-respected medical journals. 13

In January 2009, for example, the Proceedings of the National Academy of Sciences published a study in which German researchers compared two groups of elderly individuals—one that reduced their calories by 30 percent and another that was allowed to eat whatever they wanted. The researchers were interested in whether changes could be measured between the two groups’ memory function. At the conclusion of the three-month study, those who were free to eat without restriction experienced a small but clearly defined decline in memory function, while memory function in the group on the reduced-calorie diet actually increased, and significantly so. Knowing that current pharmaceutical approaches to brain health are very limited, the authors concluded, “The present findings may help to develop new prevention and treatment strategies for maintaining cognitive health into old age.” 14

Further evidence supporting the role of calorie restriction in strengthening the brain and providing more resistance to degenerative disease comes from Dr. Mark P. Mattson, chief of the Laboratory of Neurosciences at the National Institute on Aging (NIA). He reported:

Epidemiological data suggest that individuals with a low calorie intake may have a reduced risk of stroke and neurodegenerative disorders. There is a strong correlation between per capita food consumption and risk for Alzheimer’s disease and stroke. Data from population-based case control studies showed that individuals with the lowest daily calorie intakes had the lowest risk of Alzheimer’s disease and Parkinson’s disease. 15

Mattson was referring to a population-based longitudinal prospective study of Nigerian families, in which some members moved to the United States. Many people believe that Alzheimer’s disease is something you “get” from your DNA, but this particular study told a different story. It was shown that the incidence of Alzheimer’s disease among Nigerian immigrants living in the United States was increased compared to their relatives who remained in Nigeria. Genetically, the Nigerians who moved to America were the same as their relatives who remained in Nigeria. 16 All that changed was their environment—specifically, their caloric intake. The research clearly focused on the detrimental effects that a higher caloric consumption has on brain health. In a 2016 study published in Johns Hopkins Health Review, Mattson again emphasized the value of caloric restriction in warding off neurodegenerative diseases while at the same time improving memory and mood. 17 One way to do that is through intermittent fasting, which we’ll fully explore in chapter 7 . Another way, obviously, is to trim back your daily consumption.

If the prospect of reducing your calorie intake by 30 percent seems daunting, consider the following: On average, we consume 23 percent more calories a day than we did in 1970. 18 Based on data from the Food and Agriculture Organization of the United Nations, the average American adult consumes more than 3,600 calories daily. 19 Most would consider “normal” calorie consumption to be around 2,000 calories daily for women and 2,500 for men (with higher requirements depending on level of activity/exercise). A 30 percent cut from an average of 3,600 calories per day equals a reduction of 1,080 calories, bringing intake down to about 2,520.

We owe a lot of our increased calorie consumption to sugar. Remember, the average American consumes roughly 163 grams (652 calories) of refined sugars a day—reflecting upward of a 30 percent hike in just the last three decades. 20 And of that amount, about 76 grams (302 calories) are from high-fructose corn syrup. So focusing on just reducing sugar intake may go a long way toward achieving a meaningful reduction in calorie intake, and this would obviously help with weight loss. Indeed, obesity is associated with reduced levels of BDNF, as is elevation of blood sugar. Remember, too, that increasing BDNF provides the added benefit of actually reducing appetite. I call that a double bonus.
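The arithmetic in the last two paragraphs can be checked with a quick sketch. Note that the 4 kcal-per-gram conversion for sugar is the standard nutritional factor, assumed here rather than stated in the text:

```python
# Calorie-restriction arithmetic from the passage above.
AVG_DAILY_CALORIES = 3600          # average US adult intake cited in the text

cut = 0.30 * AVG_DAILY_CALORIES    # a 30 percent reduction
remaining = AVG_DAILY_CALORIES - cut
print(f"30% cut: {cut:.0f} kcal removed, {remaining:.0f} kcal remaining")
# 30% cut: 1080 kcal removed, 2520 kcal remaining

# Refined-sugar figures: 4 kcal per gram is the standard conversion
# factor (an assumption here; the text gives only grams and calories).
KCAL_PER_GRAM_SUGAR = 4
print(163 * KCAL_PER_GRAM_SUGAR)   # 652 kcal from 163 g of refined sugar
print(76 * KCAL_PER_GRAM_SUGAR)    # 304 kcal; the text's 302 implies a
                                   # figure slightly under 76 g of HFCS
```

The sugar numbers confirm the text's larger point: cutting refined sugar alone removes roughly 650 of the 1,080 calories a 30 percent restriction calls for.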

But if the figures above still aren’t enough to motivate you toward a diet destined to help your brain, in many respects, the same pathway that turns on BDNF production can be activated by intermittent fasting (which, again, I’ll detail in chapter 7 ).

The beneficial effects in treating neurologic conditions using caloric restriction actually aren’t news for modern science, though; they have been recognized since antiquity. Calorie restriction was the first effective treatment in medical history for epileptic seizures. But now we know how and why it’s so effective: It confers neuroprotection, increases the growth of new brain cells, and allows existing neural networks to expand their sphere of influence (i.e., neuroplasticity).

While low caloric intake is well documented in relation to promoting longevity in a variety of species—including roundworms, rodents, and monkeys—research has also demonstrated that lower caloric intake is associated with a decreased incidence of Alzheimer’s and Parkinson’s disease. And the mechanisms by which we think this happens are via improved mitochondrial function and controlling gene expression.

Consuming fewer calories decreases the generation of free radicals while at the same time enhancing energy production from the mitochondria, the tiny organelles in our cells that generate chemical energy in the form of ATP (adenosine triphosphate). Mitochondria have their own DNA, and we know now that they play a key role in degenerative diseases such as Alzheimer’s and cancer. Caloric restriction also has a dramatic effect on reducing apoptosis, the process by which cells undergo self-destruction. Apoptosis happens when genetic mechanisms within cells are turned on that culminate in the death of that cell. While it may seem puzzling at first as to why this should be looked upon as a positive event, apoptosis is a critical cellular function for life as we know it. Pre-programmed cell death is a normal and vital part of all living tissues, but a balance must be struck between effective and destructive apoptosis. In addition, caloric restriction triggers a decrease in inflammatory factors and an increase in neuroprotective factors, specifically BDNF. It also has been demonstrated to increase the body’s natural antioxidant defenses by boosting enzymes and molecules that are important in quenching excessive free radicals.

In 2008, Dr. Veronica Araya of the University of Chile in Santiago reported on a study she performed during which she placed overweight and obese subjects on a three-month calorie-restricted diet, with a total reduction of 25 percent of calories. 21 She and her colleagues measured an exceptional increase in BDNF production, which led to notable reductions in appetite. It’s also been shown that the opposite occurs: BDNF production is decreased in animals on a diet high in sugar. 22 Findings like this have since been replicated.

One of the most well-studied molecules associated with caloric restriction and the growth of new brain cells is sirtuin-1 (SIRT1), an enzyme that regulates gene expression. In monkeys, increased SIRT1 activation enhances an enzyme that degrades amyloid—the starch-like protein whose accumulation is the hallmark of diseases like Alzheimer’s. 23 In addition, SIRT1 activation changes certain receptors on cells, leading to reactions that have the overall effect of reducing inflammation. Perhaps most important, activation of the sirtuin pathway by caloric restriction enhances BDNF. BDNF not only increases the number of brain cells, but also enhances their differentiation into functional neurons (again, because of caloric restriction). For this reason, we say that BDNF enhances learning and memory. 24

The Benefits of a Ketogenic Diet

While caloric restriction is able to activate these diverse pathways, which are not only protective of the brain but enhance the growth of new neuronal networks, the same pathway can be activated by the consumption of special fats called ketones. By far the most important fat for brain energy utilization is beta-hydroxybutyrate (beta-HBA), and we’ll explore this unique fat in more detail in the next chapter. This is why the so-called ketogenic diet has been a treatment for epilepsy since the early 1920s and is now being reevaluated as a therapeutic option in the treatment of Parkinson’s disease, Alzheimer’s disease, ALS, depression, and even cancer and autism. 25 It’s also showing promise for weight loss and ending type 2 diabetes. In mice models, the diet rescues hippocampal memory deficits, and extends healthy lifespan.

Google the term “ketogenic diet” and well over a million results pop up. Between 2015 and 2017, Google searches for the term “keto” increased ninefold. But the studies demonstrating a ketogenic diet’s power date back further. In one 2005 study, for example, Parkinson’s patients actually had a notable improvement in symptoms that rivaled medications and even brain surgery after being on a ketogenic diet for just twenty-eight days. 26 Specifically, consuming ketogenic fats (i.e., medium-chain triglycerides, or MCT oil) has been shown to impart significant improvement in cognitive function in Alzheimer’s patients. 27 Coconut oil, from which we derive MCTs, is a rich source of an important precursor molecule for beta-hydroxybutyrate and is a helpful approach to treating Alzheimer’s disease. 28 A ketogenic diet has also been shown to reduce amyloid in the brain, 29 and it increases glutathione, the body’s natural brain-protective antioxidant, in the hippocampus. 30 What’s more, it stimulates the growth of mitochondria and thus increases metabolic efficiency. 31

Dominic D’Agostino is a researcher in neuroscience, molecular pharmacology, and physiology at the University of South Florida. He has written extensively on the benefits of a ketogenic diet, and in my Empowering Neurologist interview with him he stated: “Research shows that ketones are powerful energy substrates for the brain and protect the brain by enhancing antioxidant defenses while suppressing inflammation. No doubt, this is why nutritional ketosis is something pharmaceutical companies are aggressively trying to replicate.” I have also done a lot of homework in understanding the brain benefits of ketosis—a metabolic state whereby the body burns fat for energy and creates ketones in the process. Put simply, your body is in a state of ketosis when it’s creating ketones for fuel instead of relying on glucose. And the brain loves it.

While science typically has looked at the liver as the main source of ketone production in human physiology, it is now recognized that the brain can also produce ketones in special cells called astrocytes. These ketone bodies are profoundly neuroprotective. They decrease free radical production in the brain, increase mitochondrial biogenesis, and stimulate production of brain-related antioxidants. Furthermore, ketones block the apoptotic pathway that would otherwise lead to self-destruction of brain cells.

Unfortunately, ketones have gotten a bad rap. I remember in my internship being awakened by a nurse to treat a patient in “diabetic ketoacidosis.” Physicians, medical students, and interns become fearful when challenged by a patient in such a state, and with good reason. It happens in insulin-dependent type 1 diabetics when not enough insulin is available to metabolize glucose for fuel. The body turns to fat, which produces these ketones in dangerously high quantities that become toxic as they accumulate in the blood. At the same time, there is a profound loss of bicarbonate, and this leads to significant lowering of the pH (acidosis). Typically, as a result, patients lose a lot of water due to their elevated blood sugars, and a medical emergency develops.

This condition is exceedingly rare, and again, it occurs in type 1 diabetics who fail to regulate their insulin levels. Our normal physiology has evolved to handle some level of ketones in the blood; in fact, we are fairly unique in this ability among our comrades in the animal kingdom, possibly because of our large brain-to-body weight ratio and the high energy requirements of our brain. At rest, 20 percent of our oxygen consumption is used by the brain, which represents only 2 percent of the human body. In evolutionary terms, the ability to use ketones as fuel when blood sugar was exhausted and liver glycogen was no longer available (during starvation) became mandatory if we were to survive and continue hunting and gathering. Ketosis proved to be a critical step in human evolution, allowing us to persevere during times of food scarcity. To quote Gary Taubes, “In fact, we can define this mild ketosis as the normal state of human metabolism when we’re not eating the carbohydrates that didn’t exist in our diets for 99.9 percent of human history. As such, ketosis is arguably not just a natural condition but even a particularly healthful one.” 32

There is a relationship between ketosis and calorie restriction, and the two can pack a powerful punch in terms of enhancing brain health. When you restrict calories (and carbs in particular) while upping fat intake, you trigger ketosis and increase levels of ketones in the blood. In 2012, when researchers at the University of Cincinnati randomly assigned twenty-three older adults with mild cognitive impairment to either a high-carbohydrate or very low-carbohydrate diet for six weeks, they documented remarkable changes in the low-carb group. 33 They observed not only improved verbal memory performance but also reductions in weight, waist circumference, fasting glucose, and fasting insulin. Now here’s the important point: “Ketone levels were positively correlated with memory performance.”

German researchers back in 2009 demonstrated in fifty healthy, normal to overweight elderly individuals that when calories were restricted along with a 20 percent increase in dietary fat, there was a measurable increase in verbal memory scores. 34 Another small study, yes, but their findings were published in the respected Proceedings of the National Academy of Sciences and spurred further research like that of the 2012 experiment. These individuals, compared to those who did not restrict calories, demonstrated improvements in their insulin levels and decline in their C-reactive protein, the infamous marker of inflammation. As expected, the most pronounced improvements were in people who adhered the most to the dietary challenge.

Research and interest in ketosis have exploded in recent years and will continue. The key to achieving ketosis, as we’ll see later in detail, is to severely cut carbs and increase dietary fat. It’s that simple. You have to be carb restricted if you want to reach this brain-blissful state.

* * *

Power Up Your Brain
by David Perlmutter and Alberto Villoldo

Your Brain’s Evolutionary Advantage

One of the most important features distinguishing humans from all other mammals is the size of our brain in proportion to the rest of our body. While it is certainly true that other mammals have larger brains, scientists recognize that larger animals must have larger brains simply to control their larger bodies. An elephant, for example, has a brain that weighs 7,500 grams, far larger than our 1,400-gram brain. So making comparisons about “brain power” or intelligence based on brain size alone is obviously futile. Again, it’s the ratio of brain size to total body size that attracts scientists’ interest when considering the brain’s functional capacity. An elephant’s brain represents 1/550 of its body weight, while the human brain weighs 1/40 of the total body weight. So our brain represents 2.5 percent of our total body weight, as opposed to the large-brained elephant, whose brain is just 0.18 percent of its total body weight.
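The fractions above can be sanity-checked with quick arithmetic. A minimal sketch, assuming only the figures stated in the text (the body masses are back-calculated from the 1/550 and 1/40 fractions, not given in the original):

```python
# Check the brain-to-body ratios cited above.

def brain_ratio_pct(brain_g, body_g):
    """Brain mass as a percentage of total body mass."""
    return 100 * brain_g / body_g

# Elephant: 7,500 g brain; a 1/550 ratio implies a body mass of 7,500 * 550 g.
elephant = brain_ratio_pct(7_500, 7_500 * 550)

# Human: 1,400 g brain; a 1/40 ratio implies a body mass of 1,400 * 40 g (56 kg).
human = brain_ratio_pct(1_400, 1_400 * 40)

print(round(elephant, 2))  # 0.18
print(round(human, 1))     # 2.5
```

Both percentages match the figures in the passage, so the two ways of stating the ratio (fraction vs. percent) are consistent.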

But even more important than the fact that we are blessed with a lot of brain matter is the intriguing fact that, gram for gram, the human brain consumes a disproportionately huge amount of energy. While representing only 2.5 percent of our total body weight, the human brain consumes an incredible 22 percent of our body’s energy expenditure when at rest. This represents about 350 percent more energy consumption in relation to body weight compared with other anthropoids like gorillas, orangutans, and chimpanzees.

So it takes a lot of dietary calories to keep the human brain functioning. Fortunately, the very fact that we’ve developed such a large and powerful brain has provided us with the skills and intelligence to maintain adequate sustenance during times of scarcity and to make provisions for needed food supplies in the future. Indeed, the ability to conceive of and plan for the future is highly dependent upon the evolution not only of brain size but of other unique aspects of the human brain.

It is a colorful image to conceptualize early Homo sapiens migrating across an arid plain and competing for survival among animals with smaller brains yet bigger claws and greater speed. But our earliest ancestors had one other powerful advantage compared to even our closest primate relatives. The human brain has developed a unique biochemical pathway that proves hugely advantageous during times of food scarcity. Unlike other mammals, our brain is able to utilize an alternative source of calories during times of starvation. Typically, we supply our brain with glucose from our daily food consumption. We continue to supply our brains with a steady stream of glucose (blood sugar) between meals by breaking down glycogen, a storage form of glucose primarily found in the liver and muscles.

But relying on glycogen stores provides only short-term availability of glucose. As glycogen stores are depleted, our metabolism shifts and we are actually able to create new molecules of glucose, a process aptly termed gluconeogenesis. This process involves the construction of new glucose molecules from amino acids harvested from the breakdown of protein primarily found in muscle. While gluconeogenesis adds needed glucose to the system, it does so at the cost of muscle breakdown, something less than favorable for a starving hunter-gatherer.

But human physiology offers one more pathway to provide vital fuel to the demanding brain during times of scarcity. When food is unavailable, after about three days the liver begins to use body fat to create chemicals called ketones. One ketone in particular, beta hydroxybutyrate (beta-HBA), actually serves as a highly efficient fuel source for the brain, allowing humans to function cognitively for extended periods during food scarcity.

Our unique ability to power our brains using this alternative fuel source helps reduce our dependence on gluconeogenesis and therefore spares amino acids and the muscles they build and maintain. Reducing muscle breakdown provides obvious advantages for the hungry Homo sapiens in search of food. It is this unique ability to utilize beta-HBA as a brain fuel that sets us apart from our nearest animal relatives and has allowed humans to remain cognitively engaged and, therefore, more likely to survive the famines ever-present in our history.

This metabolic pathway, unique to Homo sapiens, may actually serve as an explanation for one of the most hotly debated questions in anthropology: what caused the disappearance of our Neanderthal relatives? Clearly, when it comes to brains, size does matter. Why then, with a brain some 20 percent larger than our own, did Neanderthals suddenly disappear in just a few thousand years between 40,000 and 30,000 years ago? The party line among scientists remains fixated on the notion that the demise of Neanderthals was a consequence of their hebetude, or mental lethargy. The neurobiologist William Calvin described Neanderthals in his book, A Brain for All Seasons: “Their way of life subjected them to more bone fractures; they seldom survived until forty years of age; and while making tools similar to [those of] overlapping species, there was little [of the] inventiveness that characterizes behaviorally modern Homo sapiens.”

While it is convenient and almost dogmatic to accept that Neanderthals were “wiped out” by clever Homo sapiens, many scientists now believe that food scarcity may have played a more prominent role in their disappearance. Perhaps Neanderthals, lacking the biochemical pathway to utilize beta-HBA as a fuel source for brain metabolism, simply lacked the “mental endurance” to persevere. Relying on gluconeogenesis to power their brains would have led to more rapid breakdown of muscle tissue, ultimately compromising their ability to stalk prey or migrate to areas where plant food sources were more readily available. Their extinction may not have played out in direct combat with Homo sapiens but rather manifested as a consequence of a simple biochemical inadequacy.

Our ability to utilize beta-HBA as a brain fuel is far more important than simply a protective legacy of our hunter-gatherer heritage. George F. Cahill of Harvard Medical School stated, “Recent studies have shown that beta-hydroxybutyrate, the principle ‘ketone’ is not just a fuel, but a ‘superfuel’ more efficiently producing ATP energy than glucose. . . . It has also protected neuronal cells in tissue culture against exposure to toxins associated with Alzheimer’s or Parkinson’s.”

Indeed, well beyond serving as a brain superfuel, Dr. Cahill and other researchers have determined that beta-HBA has other profoundly positive effects on brain health and function. Essentially, beta-HBA is thought to mediate many of the positive effects of calorie reduction and fasting on the brain, including improved antioxidant function, increased mitochondrial energy production with an increase in mitochondrial population, increased cellular survival, and increased levels of BDNF leading to enhanced growth of new brain cells (neurogenesis).

Fasting

Earlier, we explored the need to reduce caloric intake in order to increase BDNF as a means to stimulate the growth of new brain cells as well as to enhance the function of existing neurons. The idea of substantially reducing daily calorie intake will not appeal to many people despite the fact that it is a powerful approach to brain enhancement as well as overall health.

Interestingly, however, many people find the idea of intermittent fasting to be more appealing. Fasting is defined here as a complete abstinence from food for a defined period of time at regular intervals—our fasting program permits the drinking of water. Research demonstrates that many of the same health-providing and brain-enhancing genetic pathways activated by calorie reduction are similarly engaged by fasting—even for relatively short periods of time. Fasting actually speaks to your DNA, directing your genes to produce an astounding array of brain-enhancement factors.

Not only does fasting turn on the genetic machinery for the production of BDNF, but it also powers up the Nrf2 pathway, leading to enhanced detoxification, reduced inflammation, and increased production of brain-protective antioxidants. Fasting causes the brain to shift away from using glucose as a fuel to a metabolism that consumes ketones. When the brain metabolizes ketones for fuel, even the process of apoptosis is reduced, while mitochondrial genes turn their attention to mitochondrial replication. In this way, fasting shifts the brain’s basic metabolism and specifically targets the DNA of mitochondria, thus enhancing energy production and paving the way for better brain function and clarity . . .

* * *

Insights into human evolution from ancient and contemporary microbiome studies
by Stephanie L Schnorr, Krithivasan Sankaranarayanan, Cecil M Lewis, Jr, and Christina Warinner

Brain growth, development, and behavior

The human brain is our defining species trait, and its developmental underpinnings are key foci of evolutionary genetics research. Recent research on brain development and social interaction in both humans and animal models has revealed that microbes exert a major impact on cognitive function and behavioral patterns []. For example, a growing consensus recognizes that cognitive and behavioral pathogenesis are often co-expressed with functional bowel disorders []. This hints at a shared communication or effector pathway between the brain and gut, termed the gut-brain axis (GBA). The enteric environment is considered a third arm of the autonomic nervous system [], and gut microbes produce more than 90% of the body’s serotonin (5-hydroxytryptamine or 5-HT) []. Factors critical to learning and plasticity such as serotonin, γ-aminobutyric acid (GABA), short chain fatty acids (SCFAs), and brain derived neurotrophic factor (BDNF), which train amygdalar and hippocampal reactivity, can be mediated through gut-brain chemical signals that cross-activate bacterial and host receptors []. Probiotic treatment is associated with positive neurological changes in the brain such as increased BDNF, altered expression of GABA receptors, increased circulating glutathione, and a reduction in inflammatory markers. This implicates the gut microbiome in early emotional training as well as in affecting long-term cognitive plasticity.

Critically, gut microbiota can modulate synthesis of metabolites affecting gene expression for myelin production in the prefrontal cortex (PFC), presumably influencing the oligodendrocyte transcriptome []. Prosocial and risk-associated behavior in probiotic-treated mice, a mild analog for novelty-seeking and risk-seeking behaviors in humans, suggests a potential corollary between entrenched behavioral phenotypes and monoamines (serotonin and dopamine) produced by the gut microbiota []. Evolutionary acceleration of the human PFC metabolome divergence from chimpanzees, particularly the dopaminergic synapse [], reifies the notion that an exaggerated risk-reward complex characterizes human cognitive differentiation, which is facilitated by microbiome-derived bioactive compounds. Therefore, quintessentially human behavioral phenotypes in stress, anxiety, and novelty-seeking are additionally reinforced by microbial production of neuroactive compounds. As neurological research expands to include the microbiome, it is increasingly clear that host–microbe interactions have likely played an important role in human brain evolution and development [].

Ancient human microbiomes
by Christina Warinner, Camilla Speller, Matthew J. Collins, and Cecil M. Lewis, Jr

Need for paleomicrobiology data

Although considerable effort has been invested in characterizing healthy gut and oral microbiomes, recent investigations of rural, non-Western populations () have raised questions about whether the microbiota we currently define as normal have been shaped by recent influences of modern Western diet, hygiene, antibiotic exposure, and lifestyle (). The process of industrialization has dramatically reduced our direct interaction with natural environments and fundamentally altered our relationship with food and food production. Situated at the entry point of our food, and the locus of food digestion, the human oral and gut microbiomes have evolved under conditions of regular exposure to a diverse range of environmental and zoonotic microbes that are no longer present in today’s globalized food chain. Additionally, the foods themselves have changed from the wild natural products consumed by our hunter-gatherer ancestors to today’s urban supermarkets stocked with an abundance of highly processed Western foodstuffs containing artificially enriched levels of sugar, oil, and salt, not to mention antimicrobial preservatives, petroleum-based colorants, and numerous other artificial ingredients. This dietary shift has altered selection pressure on our microbiomes. For example, under the ‘ecological plaque hypothesis’, diseases such as dental caries and periodontal disease are described as oral ecological catastrophes of cultural and lifestyle choices ().

Although it is now clear that the human microbiome plays a critical role in making us human, in keeping us healthy, and in making us sick, we know remarkably little about the diversity, variation, and evolution of the human microbiome both today and in the past. Instead, we are left with many questions: When and how did our bacterial communities become distinctly human? And what does this mean for our microbiomes today and in the future? How do we acquire and transmit microbiomes and to what degree is this affected by our cultural practices and built environments? How have modern Western diets, hygiene practices, and antibiotic exposure impacted ‘normal’ microbiome function? Are we still in mutualistic symbiosis with our microbiomes, or are the so-called ‘diseases of civilization’ – heart disease, obesity, type II diabetes, asthma, allergies, osteoporosis – evidence that our microbiomes are out of ecological balance and teetering on dysbiosis ()? At an even more fundamental level, who are the members of the human microbiome, how did they come to inhabit us, and how long have they been there? Who is ‘our microbial self’ ()?

Studies of remote and indigenous communities () and crowdsourcing projects such as the American Gut (www.americangut.org), the Earth Microbiome Project (www.earthmicrobiome.org), and uBiome (www.uBiome.com) are attempting to characterize modern microbiomes across a range of contemporary environments. Nevertheless, even the most extensive sampling of modern microbiota will provide limited insight into Pre-Industrial microbiomes. By contrast, the direct investigation of ancient microbiomes from discrete locations and time points in the past would provide a unique view into the coevolution of microbes and hosts, host microbial ecology, and changing human health states through time. […]

Diet also plays a role in shaping the composition of oral microbiomes, most notably by the action of dietary sugar in promoting the growth of cariogenic bacteria such as lactobacilli and S. mutans (). Two recent papers have proposed that cariogenic bacteria, such as S. mutans, were absent in pre-Neolithic human populations, possibly indicating low carbohydrate diets (), while evolutionary genomic analyses of S. mutans suggest an expansion in this species approximately 10,000 years ago, coinciding with the onset of agriculture (). […]

Ancient microbiome research provides an additional pathway to understanding human biology that cannot be achieved by studies of extant individuals and related species alone. Although reconstructing the ancestral microbiome by studying our ancestors directly is not without challenges (), this approach provides a more direct picture of human-microbe coevolution. Likewise, ancient microbiome sources may reveal to what extent bacteria commonly considered ‘pathogenic’ in the modern world (for example, H. pylori) were endemic indigenous organisms in pre-Industrial microbiomes ().

The three paths to reconstructing the ancestral microbiomes are also complementary. For example, analyses of the gut microbiomes of extant rural peoples in Africa and South America have revealed the presence of a common, potentially commensal, spirochete belonging to the genus Treponema (). Such spirochetes have also been detected in extant hunter-gatherers (), and in 1,000-year-old human coprolites from Mexico (), but they are essentially absent from healthy urban populations, and they have not been reported in the gut microbiome of chimpanzees (). These multiple lines of evidence suggest that this poorly understood spirochete is a member of the ancestral human microbiome, yet not necessarily the broader primate microbiome. Future coprolite research may be able to answer the question of how long this microbe has co-associated with humans, and what niche it fills.

* * *

Ketogenic Diet and Neurocognitive Health
Spartan Diet
The Agricultural Mind
Malnourished Americans

Neuroscientist Shows What Fasting Does To Your Brain & Why Big Pharma Won’t Study It
by Arjun Walia

Does Fasting Make You Smarter?
by Derek Beres

Fasting Cleans the Brain
by P. D. Mangan

How Fasting Heals Your Brain
by Adriana Ayales

Effect of Intermittent Fasting on Brain Neurotransmitters, Neutrophils Phagocytic Activity, and Histopathological Finding in Some Organs in Rats
by Sherif M. Shawky, Anis M. Zaid, Sahar H. Orabi, Khaled M. Shoghy, and Wafaa A. Hassan

The Effects of Fasting During Ramadan on the Concentration of Serotonin, Dopamine, Brain-Derived Neurotrophic Factor and Nerve Growth Factor
by Abdolhossein Bastani, Sadegh Rajabi, and Fatemeh Kianimarkani

Gut microbiome, SCFAs, mood disorders, ketogenic diet and seizures
by Jonathan Miller

Study: Ketogenic diet appears to prevent cognitive decline in mice
by University of Kentucky

Low-carb Diet Alleviates Inherited Form of Intellectual Disability in Mice
by Johns Hopkins Medicine

Ketogenic Diet Protects Against Alzheimer’s Disease by Keeping Your Brain Healthy and Youthful
by Joseph Mercola

The Gut Microbiota Mediates the Anti-Seizure Effects of the Ketogenic Diet.
by C. A. Olson

Is the Keto Diet Bad for the Microbiome?
by David Jockers

Does a Ketogenic Diet Change Our Microbiome?
by Christie Rice

Can Health Issues Be Solved By Dietary Changes Altering the Microbiome?
by Russ Schierling

Some Benefits of Intermittent Fasting are Mediated by the Gut Microbiome
by Fight Aging!

RHR: Is High Fat Healthy for the Gut Microbiota?
by Chris Kresser

A Comprehensive List of Low Carb Research
by Sarah Hallberg

Randomised Controlled Trials Comparing Low-Carb Diets Of Less Than 130g Carbohydrate Per Day To Low-Fat Diets Of Less Than 35% Fat Of Total Calories
from Public Health Collaboration

Carcinogenic Grains

In understanding human health, we have to look at all factors as a package deal. Our gut-brain is a system, as is our entire mind-body. Our relationships, lifestyle, the environment around us — all of it is inseparable. This is true even if we limit ourselves to diet alone. It’s not simply calories in/calories out, macronutrient ratios, or anything else along these lines. It is the specific foods eaten in combination with which other foods and in the context of stress, toxins, epigenetic inheritance, gut health, and so much else that determine what effects manifest in the individual.

There are numerous examples of this. But I’ll stick to a simple one, which involves several factors and the relationship between them. First, red meat is associated with cancer and heart disease. Yet causation is hard to prove, as red meat consumption is associated with many other foods in the standard American diet, such as added sugars and vegetable oils in processed foods. The association might be based on confounding factors that are culture-specific, which can explain why we find societies with heavy meat consumption and little cancer.

So, what else might be involved? We have to consider what red meat is being eaten with, at least in the standard American diet that is used as a control in most research. There are, of course, the added sugars and vegetable oils, which are seriously bad for health and may explain much of the confusion. Saturated fat intake has been dropping since the early 1900s and, in its place, there has been a steady rise in the use of vegetable oils; we now know that highly heated and hydrogenated vegetable oils do severe damage. Also, when some of the original research that blamed saturated fat was re-analyzed, sugar showed the stronger correlation with heart disease.

Saturated fat, like cholesterol, had been wrongly accused. This misunderstanding has, over multiple generations at this point, led to the early death of at least hundreds of millions of people worldwide, as dozens of the wealthiest and most powerful countries enforced it in their official dietary recommendations, which transformed the world’s food system. Similar to eggs, red meat became the fall guy.

Such things as heart disease are related to obesity, and conventional wisdom tells us that fat makes us fat. Is that true? Not exactly or directly. I was amused to discover that a scientific report commissioned by the British government in 1846 (Experimental Researches on the Food of Animals, and the Fattening of Cattle: With Remarks on the Food of Man. Based Upon Experiments Undertaken by Order of the British Government by Robert Dundas Thomson) concluded that “The present experiments seem to demonstrate that the fat of animals cannot be produced from the oil of the food.” Fat doesn’t make people fat, and it has been observed for centuries that low-carb, meat-eating populations tend to be slim.

So, in most cases, what does cause fat accumulation? It is only fat combined with plenty of carbs and sugar that is guaranteed to make us fat, that is to say, fat in the presence of glucose, since the two compete as fuel sources.

Think about what an American meal with red meat looks like. A plate might have a steak with some rolls or slices of bread, combined with a potato and maybe some starchy ‘vegetables’ like corn, peas, or lima beans. Or there will be a hamburger with a bun, a side of fries, and a large sugary drink (‘diet’ drinks are no better, as we now know artificial sweeteners fool the body and so are just as likely to make you fat and diabetic). What is the common factor? Red meat combined with wheat or some other grain, as part of a diet drenched in carbs and sugar (and all of it cooked or slathered in vegetable oils).

Most Americans have a far greater total intake of carbs, sugar, and vegetable oils than of red meat and saturated fat. The preferred meat of Americans these days is chicken, with fish also being popular. Why do red meat and saturated fat continue to be blamed for the worsening rates of heart disease and metabolic disease? It’s simply not rational, based on the established facts in the field of diet and nutrition. That isn’t to claim that too much red meat couldn’t be problematic. It depends on the total diet. Also, Americans have the habit of grilling their red meat, and grilling increases carcinogens, which could be avoided by not charring one’s meat; but that equally applies to not burning (or frying) anything one eats, including white meat and plant foods. In terms of this one factor, you’d be better off eating beef roasted with vegetables than going with a plant-based meal that included foods like french fries, fried okra, grilled vegetable shish kabobs, etc.

Considering all of that, what exactly is the cause of cancer that keeps showing up in epidemiological studies? Sarah Ballantyne has some good answers to that (see quoted passage below). It’s not so much about red meat itself as it is about what red meat is eaten with. The crux of the matter is that Americans eat more starchy carbs, mostly refined flour, than they do vegetables. What Ballantyne explains is that two of the potential causes of cancer associated with red meat only occur in a diet deficient in vegetables and abundant in grains. It is the total diet as seen in the American population that is the cause of high rates of cancer.

Just as a heavy meat diet without grains is not problematic, a heavy carb diet without grains is not necessarily problematic either. Some of the healthiest populations eat lots of carbs like sweet potatoes, but you won’t find any healthy population that eats as many grains as do Americans. There are many issues with grains considered in isolation (read the work of David Perlmutter or any number of writers on the paleo diet), but grains combined with certain other foods in particular can contribute to health concerns.

Then again, some of this is about proportion. For most of the history of agriculture, humans ate small amounts of grains as an occasional food. Grains tended to be stored for hard times or for trade, or else turned into alcohol to be mixed with water from unclean sources. The shift to large amounts of grains made into refined flour is an evolutionarily unique dilemma our bodies aren’t designed to handle. The first accounts of white bread are found in texts from slightly over two millennia ago, and most Westerners couldn’t afford white bread until the past few centuries, when industrialized milling began. Before that, people tended to eat foods that were available and didn’t mix them as much (e.g., eating fruits and vegetables in season). Hamburgers were invented only about a century ago. The constant combining of red meat and grains is not something we are adapted for. That harm to our health results maybe shouldn’t surprise us.

Red meat can be a net loss to health or a net gain. It depends not on the red meat, but what is and isn’t eaten with it. Other factors matter as well. Health can’t be limited to a list of dos and don’ts, even if such lists have their place in the context of more detailed knowledge and understanding. The simplest solution is to eat as most humans ate for hundreds of thousands of years, and more than anything else that means avoiding grains. Even without red meat, many people have difficulties with grains.

Let’s return to the context of evolution. Hominids have been eating fatty red meat for millions of years (early humans having prized red meat from blubbery megafauna until their mass extinction), and yet meat-eating hunter-gatherers rarely get cancer, heart disease, or any of the other modern ailments. How long ago was it when the first humans ate grains? About 12 thousand years ago. Most humans on the planet never touched a grain until the past few millennia. And fewer still included grains with almost every snack and meal until the past few generations. So, what is this insanity of government dietary recommendations putting grains as the base of the food pyramid? Those grains are feeding the cancerous microbes, and doing much else that is harmful.

In conclusion, is red meat bad for human health? It depends. Red meat that is charred or heavily processed combined with wheat and other carbs, lots of sugar and vegetable oils, and few nutritious vegetables, well, that would be a shitty diet that will inevitably lead to horrible health consequences. Then again, the exact same diet minus the red meat would still be a recipe for disease and early death. Yet under other conditions, red meat can be part of a healthy diet. Even a ton of pasture-raised red meat (with plenty of nutrient-dense organ meats) combined with an equal amount of organic vegetables (grown on healthy soil, bought locally, and eaten in season), in exclusion of grains especially refined flour and with limited intake of all the other crap, that would be one of the healthiest diets you could eat.

On the other hand, if you are addicted to grains as many are and can’t imagine a world without them, you would be wise to avoid red meat entirely. Assuming you have any concerns about cancer, you should choose one or the other but not both. I would note, though, that there are many other reasons to avoid grains while there are no other known reasons to avoid red meat, at least for serious health concerns, although some people exclude red meat for other reasons such as digestion issues. The point is that whether or not you eat red meat is a personal choice (based on taste, ethics, etc), not so much a health choice, as long as we separate out grains. That is all we can say for certain based on present scientific knowledge.

* * *

We’ve known about this for years now. Isn’t it interesting that no major health organization, scientific institution, corporate news outlet, or government agency has ever warned the public about the risk factors of carcinogenic grains? Instead, we get major propaganda campaigns to eat more grains because that is where the profit is for big ag, big food, and big oil (that makes farm chemicals and transports the products of big ag and big food). How convenient! It’s nice to know that corporate profit is more important than public health.

But keep listening to those who tell you that cows are destroying the world, even though there are fewer cows in North America than there once were buffalo. Yeah, monocultural GMO crops immersed in deadly chemicals that destroy soil and deplete nutrients are going to save us, not traditional grazing land that existed for hundreds of millions of years. So, sure, we could go on producing massive yields of grains in a utopian fantasy beloved by technocrats and plutocrats that further disconnects us from the natural world and our evolutionary origins, an industrial food system dependent on turning the whole world into endless monocrops denatured of all other life, making entire regions into ecological deserts that push us further into mass extinction. Or we could return to traditional ways of farming and living with a more traditional diet largely of animal foods (meat, fish, eggs, dairy, etc) balanced with an equal amount of vegetables, the original hunter-gatherer diet.

Our personal health is important. And it is intimately tied to the health of the earth. Civilization as we know it was built on grains. That wasn’t necessarily a problem when grains were a small part of the diet and populations were small. But is it still a sustainable socioeconomic system as part of a healthy ecological system? No, it isn’t. So why do we continue to do more of the same that caused our problems in the hope that it will solve our problems? As we think about how different parts of our diet work together to create conditions of disease or health, we need to begin thinking this way about our entire world.

* * *

Paleo Principles
by Sarah Ballantyne

While this often gets framed as an argument for going vegetarian or vegan, it’s actually a reflection of the importance of eating plenty of plant foods along with meat. When we take a closer look at these studies, we see something extraordinarily interesting: the link between meat and cancer tends to disappear once the studies adjust for vegetable intake. Even more exciting, when we examine the mechanistic links between meat and cancer, it turns out that many of the harmful (yes, legitimately harmful!) compounds of meat are counteracted by protective compounds in plant foods.

One major mechanism linking meat to cancer involves heme, the iron-containing compound that gives red meat its color (in contrast to the nonheme iron found in plant foods). Where heme becomes a problem is in the gut: the cells lining the digestive tract (enterocytes) metabolize it into cytotoxic compounds (meaning toxic to living cells), which can then damage the gut barrier (specifically the colonic mucosa; see page 67), cause cell proliferation, and increase fecal water toxicity—all of which raise cancer risk. Yikes! In fact, part of the reason red meat is linked with cancer far more often than with white meat could be due to their differences in heme content; white meat (poultry and fish) contains much, much less.

Here’s where vegetables come to the rescue! Chlorophyll, the pigment in plants that makes them green, has a molecular structure that’s very similar to heme. As a result, chlorophyll can block the metabolism of heme in the intestinal tract and prevent those toxic metabolites from forming. Instead of turning into harmful by-products, heme ends up being metabolized into inert compounds that are no longer toxic or damaging to the colon. Animal studies have demonstrated this effect in action: one study on rats showed that supplementing a heme-rich diet with chlorophyll (in the form of spinach) completely suppressed the pro-cancer effects of heme. All the more reason to eat a salad with your steak.

Another mechanism involves L-carnitine, an amino acid that’s particularly abundant in red meat (another candidate for why red meat seems to disproportionately increase cancer risk compared to other meats). When we consume L-carnitine, our intestinal bacteria metabolize it into a compound called trimethylamine (TMA). From there, the TMA enters the bloodstream and gets oxidized by the liver into yet another compound, trimethylamine-N-oxide (TMAO). This is the one we need to pay attention to!

TMAO has been strongly linked to cancer and heart disease, possibly due to promoting inflammation and altering cholesterol transport. Having high levels of it in the bloodstream could be a major risk factor for some chronic diseases. So is this the nail in the coffin for meat eaters?

Not so fast! An important study on this topic published in 2013 in Nature Medicine sheds light on what’s really going on. This paper had quite a few components, but one of the most interesting has to do with gut bacteria. Basically, it turns out that the bacteria group Prevotella is a key mediator between L-carnitine consumption and having high TMAO levels in our blood. In this study, the researchers found that participants with gut microbiomes dominated by Prevotella produced the most TMA (and therefore TMAO, after it reached the liver) from the L-carnitine they ate. Those with microbiomes high in Bacteroides rather than Prevotella saw dramatically less conversion to TMA and TMAO.

Guess what Prevotella loves to snack on? Grains! It just so happens that people with high Prevotella levels tend to be those who eat grain-based diets (especially whole grain), since this bacterial group specializes in fermenting the type of polysaccharides abundant in grain products. (For instance, we see extremely high levels of Prevotella in populations in rural Africa that rely on cereals like millet and sorghum.) At the same time, Prevotella doesn’t seem to be associated with a high intake of non-grain plant sources, such as fruit and vegetables.

So is it really the red meat that’s a problem . . . or is it the meat in the context of a grain-rich diet? Based on the evidence we have so far, it seems that grains (and the bacteria that love to eat them) are a mandatory part of the L-carnitine-to-TMAO pathway. Ditch the grains, embrace veggies, and our gut will become a more hospitable place for red meat!

* * *

Georgia Ede has a detailed article about the claim of meat causing cancer. In it, she provides several useful summaries of and quotes from the scientific literature.

WHO Says Meat Causes Cancer?

In November 2013, 23 cancer experts from eight countries gathered in Norway to examine the science related to colon cancer and red/processed meat. They concluded:

“…the interactions between meat, gut and health outcomes such as CRC [colorectal cancer] are very complex and are not clearly pointing in one direction…. Epidemiological and mechanistic data on associations between red and processed meat intake and CRC are inconsistent and underlying mechanisms are unclear… Better biomarkers of meat intake and of cancer occurrence and updated food composition databases are required for future studies.” 1) To read the full report: http://www.ncbi.nlm.nih.gov/pubmed/24769880 [open access]

Translation: we don’t know if meat causes colorectal cancer. Now THAT is a responsible, honest, scientific conclusion.

How the WHO?

How could the WHO have come to such a different conclusion than this recent international gathering of cancer scientists? As you will see for yourself in my analysis below, the WHO made the following irresponsible decisions:

  1. The WHO cherry-picked studies that supported its anti-meat conclusions, ignoring those that showed either no connection between meat and cancer or even a protective effect of meat on colon cancer risk. These neutral and protective studies were specifically mentioned within the studies cited by the WHO (which makes one wonder whether the WHO committee members actually read the studies referenced in its own report).
  2. The WHO relied heavily on dozens of “epidemiological” studies (which by their very nature are incapable of demonstrating a cause and effect relationship between meat and cancer) to support its claim that meat causes cancer.
  3. The WHO cited a mere SIX experimental studies suggesting a possible link between meat and colorectal cancer, four of which were conducted by the same research group.
  4. THREE of the six experimental studies were conducted solely on RATS. Rats are not humans and may not be physiologically adapted to high-meat diets. All rats were injected with powerful carcinogenic chemicals prior to being fed meat. Yes, you read that correctly.
  5. Only THREE of the six experimental studies were human studies. All were conducted with a very small number of subjects and were seriously flawed in more than one important way. Examples of flaws include using unreliable or outdated biomarkers and/or failing to include proper controls.
  6. Some of the theories put forth by the WHO about how red/processed meat might cause cancer are controversial or have already been disproved. These theories were discredited within the texts of the very same studies cited to support the WHO’s anti-meat conclusions, again suggesting that the WHO committee members either didn’t read these studies or deliberately omitted information that didn’t support the WHO’s anti-meat position.

Does it matter whether the WHO gets it right or wrong about meat and cancer? YES.

“Strong media coverage and ambiguous research results could stimulate consumers to adapt a ‘safety first’ strategy that could result in abolishment of red meat from the diet completely. However, there are reasons to keep red meat in the diet. Red meat (beef in particular) is a nutrient dense food and typically has a better ratio of N6:N3-polyunsaturated fatty acids and significantly more vitamin A, B6 and B12, zinc and iron than white meat (compared values from the Dutch Food Composition Database 2013, raw meat). Iron deficiencies are still common in parts of the populations in both developing and industrialized countries, particularly pre-school children and women of childbearing age (WHO)… Red meat also contains high levels of carnitine, coenzyme Q10, and creatine, which are bioactive compounds that may have positive effects on health.” 2)

The bottom line is that there is no good evidence that unprocessed red meat increases our risk for cancer. Fresh red meat is a highly nutritious food which has formed the foundation of human diets for nearly two million years. Red meat is a concentrated source of easily digestible, highly bioavailable protein, essential vitamins and minerals. These nutrients are more difficult to obtain from plant sources.

It makes no sense to blame an ancient, natural, whole food for the skyrocketing rates of cancer in modern times. I’m not interested in defending the reputation of processed meat (or processed foods of any kind, for that matter), but even the science behind processed meat and cancer is unconvincing, as I think you’ll agree. […]

Regardless, even if you believe in the (non-existent) power of epidemiological studies to provide meaningful information about nutrition, more than half of the 29 epidemiological studies did NOT support the WHO’s stance on unprocessed red meat and colorectal cancer.

It is irresponsible and misleading to include this random collection of positive and negative epidemiological studies as evidence against meat.

The following quote is taken from one of the experimental studies cited by the WHO. The authors of the study begin their paper with this striking statement:

“In puzzling contrast with epidemiological studies, experimental studies do not support the hypothesis that red meat increases colorectal cancer risk. Among the 12 rodent studies reported in the literature, none demonstrated a specific promotional effect of red meat.” 3)

[Oddly enough, none of these twelve “red meat is fine” studies, which the authors went on to list and describe within the text of the introduction to this article, were included in the WHO report].

I cannot emphasize enough how common it is to see statements like this in scientific papers about red meat. Over and over again, researchers see that epidemiology suggests a theoretical connection between some food and some health problem, so they conduct experiments to test the theory and find no connection. This is why our nutrition headlines are constantly changing. One day eggs are bad for you, the next day they’re fine. Epidemiologists are forever sending well-intentioned scientists on time-consuming, expensive wild goose chases, trying to prove that meat is dangerous, when all other sources, from anthropology to physiology to biochemistry to common sense, tell us that meat is nutritious and safe.

Spartan Diet

There are a number of well-known low-carb diets. The most widely cited is that of the Inuit, but the Masai are often mentioned as well. I came across another example in Jack Weatherford’s Genghis Khan and the Making of the Modern World (see here for earlier discussion).

Mongols lived off of meat, blood, and milk paste. This diet, as the Chinese observed, allowed the Mongol warriors to ride and fight for days on end without needing to stop for meals. Part of this is because they could eat while riding, but there is a more key factor. This diet is so low-carb as to be ketogenic. And long-term ketosis leads to fat-adaptation which allows for high energy and stamina, even without meals as long as one has enough fat reserves (i.e., body fat). The feast and fast style of eating is common among non-agriculturalists.

There are other historical examples I haven’t previously researched. Ori Hofmekler, in The Warrior Diet, claims that Spartans and Romans ate in a brief period each day, about a four-hour window — because of the practice of having a communal meal once a day. This basically meant fasting for lengthy periods, although today it is often described as time-restricted eating. As I recall, Sikh monks have a similar practice of only eating one meal a day during which they are free to eat as much as they want. The trick to this diet is that it decreases overall food intake and keeps the body in ketosis more often — if starchy foods are restricted enough and the body is fat-adapted, this lessens hunger and cravings.

The Mongols may have been doing something similar. The thing about ketosis is that your desire to snack all the time simply goes away. You don’t have to force yourself into food deprivation, and it isn’t starvation, even when going without food for several days. As long as there is plenty of body fat and you are fat-adapted, the body maintains health, energy and mood just fine until the next big meal. Even non-warrior societies do this. The meat-loving and blubber-gluttonous Inuit don’t tolerate aggression in the slightest, and they certainly aren’t known for amassing large armies and going on military campaigns. Or consider the Piraha, who are largely pacifists, banishing their own members if they kill another person, even someone from another tribe. The Piraha get about 70% of their diet from fish and other meat, that is to say a ketogenic diet. Plus, even though surrounded by lush forests filled with a wide variety of food, plants and animals, the Piraha regularly choose not to eat — sometimes for no particular reason but also sometimes when doing communal dances over multiple days.

So, I wouldn’t be surprised if Spartan and Roman warriors had similar practices, especially the Spartans who didn’t farm much (the grains that were grown by the Spartans’ slaves likely were most often fed to the slaves, not as much to the ruling Spartans). As for Romans, their diet probably became more carb-centric as Rome grew into an agricultural empire. But early on in the days of the Roman Republic, Romans probably were like Spartans in the heavy focus they would have put on raising cattle and hunting game. Still, a diet doesn’t have to be heavy in fatty meat to be ketogenic, as long as it involves some combination of calorie restriction, portion control, narrow periods of meals, intermittent fasting, etc — all being other ways of lessening the total intake of starchy foods.

One of the most common meals for Spartans was a blood and bone broth using boiled pork mixed with salt and vinegar, the consistency being thick and the color black. That would have included a lot of fat, fat-soluble vitamins, minerals, collagen, electrolytes, and much else. It was a nutrient-dense elixir of health, however horrible it may seem to the modern palate. And it probably was low-carb, depending on what else might’ve been added to it. Even the wine Spartans drank was watered down, as drunkenness was frowned upon. The purpose was probably more to kill unhealthy microbes in the water, as watered-down beer did for early Americans millennia later, and so it would have added little sugar to the diet. Like the Mongols, they also enjoyed dairy. And they did have some grains such as bread, but apparently it was never a staple of their diet.

One thing they probably ate little of was olive oil, assuming it was used at all, as it was rarely mentioned in ancient texts and only became popular among Greeks in recent history, specifically the past century (discussed by Nina Teicholz in The Big Fat Surprise). Instead, Spartans as with most other early Greeks would have preferred animal fat, mostly lard in the case of the Spartans, whereas many other less landlocked Greeks preferred fish. Other foods the ancient Greeks, Spartans and otherwise, lacked were tomatoes, later introduced from the New World, and noodles, later introduced from China, both during the colonial era of recent centuries. So, a traditional Greek diet would have looked far different from what we think of as the modern ‘Mediterranean diet’.

On top of that, Spartans were proud of eating very little and proud of their ability to fast. Plutarch (2nd century AD) writes in Parallel Lives, “For the meals allowed them are scanty, in order that they may take into their own hands the fight against hunger, and so be forced into boldness and cunning”. Also, Xenophon, who was alive whilst Sparta existed, writes in Spartan Society 2, “furnish for the common meal just the right amount for [the boys in their charge] never to become sluggish through being too full, while also giving them a taste of what it is not to have enough.” (from The Ancient Warrior Diet: Spartans) It’s hard to see how this wouldn’t have been ketogenic. Spartans were known for being great warriors achieving feats of military prowess that would’ve been impossible from lesser men. On their fatty meat diet of pork and game, they were taller and leaner than other Greeks. They didn’t have large meals and fasted for most of the day, but when they did eat it was food dense in fat, calories, and nutrition.

* * *

Ancient Spartan Food and Diet
from Legend & Chronicles

The Secrets of Spartan Cuisine
by Helena P. Schrader

Cold War Silencing of Science

In the early Cold War, the United States government at times was amazingly heavy-handed in its use of domestic power.

There was plenty of surveillance, of course. But there was also blatant propaganda with professors, journalists, and artists on the payroll of intelligence agencies, not to mention funding going to writing programs, American studies, etc. Worse still, there were such things as COINTELPRO, including truly effed up shit like the attempt to blackmail Martin Luther King, jr. into committing suicide. There is another angle to this. Along with putting out propaganda, they would do the opposite by trying to silence alternative voices and enforce conformity. They did that with the McCarthyist attacks on anyone perceived or falsely portrayed as deviant or as a fellow traveler of deviants. This destroyed careers and drove some of those devastated to suicide. But there was another kind of shutting down that I find sad as someone who affirms a free society as, among other things, the free flow of information.

When Nikola Tesla died, the FBI swooped in and stole his research with no justification, as Tesla was a US citizen and such actions are both illegal and unconstitutional. They didn’t release his papers until 73 years later and no one knows if they released everything, as there is no transparency or accountability. One of the most famous examples is much more heinous. Wilhelm Reich was targeted by the American Medical Association, FDA, and FBI. The government arrested him and sentenced him to prison where he died. All of his journals and books were incinerated. In the end, the FDA had spent $2 million investigating and prosecuting Reich, simply because they didn’t like his research and of course his promoting sexual deviancy through free love.

These were not minor figures either. Nikola Tesla was one of the greatest scientists in the world and most definitely the greatest inventor in American history. And Wilhelm Reich was a famous doctor and psychoanalyst, an associate of Sigmund Freud and Carl Jung, and a well known writer. Their otherwise respectable positions didn’t protect them. Imagine what the government could get away with when they targeted average Americans with no one to protest and come to their defense. This same abuse of power was seen in related fields. A major focus of Reich’s work was health and, of course, he shared that area of concern with the FDA who saw it as their personal territory to rule as they wished. The FDA went after many with alternative health views that gained enough public attention and they could always find a reason to justify persecution.

I’ve come across examples in diet and nutrition, such as last year when I read Nina Planck’s Real Food where she writes about Adelle Davis, a biochemist and nutritionist who became a popular writer and gained celebrity as a public intellectual. Since she advocated a healthy diet of traditional foods, this put her in the cross-hairs of the powerful that sought to defend the standard American diet (SAD):

“My mother’s other nutritional hero was Adelle Davis, the best-selling writer who recommended whole foods and lots of protein. […] Davis had a master’s degree in biochemistry from the University of Southern California Medical School, but she wrote about nutrition in a friendly, common-sense style. In the 1950s and ’60s, titles like Let’s Eat Right to Keep Fit and Let’s Get Well became bestsellers. […] Like Price, Davis was controversial. ‘She so infuriated the medical profession and the orthodox nutrition community that they would stop at nothing to discredit her,’ recalls my friend Joann Grohman, a dairy farmer and nutrition writer who says Adelle Davis restored her own health and that of her five young children. ‘The FDA raided health food stores and seized her books under a false labeling law because they were displayed next to vitamin bottles.’ ”

In the same period during the 1950s and 1960s, the FDA went after Carlton Fredericks in an extended battle. He had a master’s degree and a doctorate in public health education and was a former associate professor. What was his horrific crime? He suggested that the modern food supply had become deficient in nutrients because of industrial processing and that supplementation was therefore necessary for health. It didn’t matter this was factually true. Fredericks’ mistake was stating such obvious truths openly on his radio show and in his written material. The FDA seized copies of Eat, Live and Be Merry (1961) for allegedly recommending the treatment of ailments “with vitamin and mineral supplements, which products are not effective in treating such conditions” (Congress 1965). They declared this as “false labeling”, despite it never contradicting any known science at the time or since. Then a few years later, the Federal Trade Commission brought a similar charge of false advertising in the selling of his tape-recorded programs and writing, but the allegations didn’t stick and the case was dropped.

A brief perusal of web search results brought up a similar case. Gayelord Hauser was a nutritionist with degrees in naturopathy and chiropractic who, like the others, became a popular writer — with multiple books translated into 12 languages and a regular column in Hearst newspapers read nationwide. What brought official ire down upon him was that he became so famous as to be befriended by numerous Hollywood actors, which elevated his popularity even further. Authority figures in the government and experts within the medical field saw him as a ‘quack’ and ‘food faddist’, which is to say as an ideological competitor who needed to be eliminated. His views worthy of being silenced included that Americans should eat more foods rich in B vitamins and avoid sugar and white flour. As you can see, he was a monster and a public menace. This brought on the righteous wrath of the American Medical Association along with the flour and sugar lobbies, leading to an initial charge of practicing medicine without a license with products seized and destroyed. Later on, in recommending blackstrap molasses as a nutrient-dense food which it is, the FDA made the standard accusation of product endorsement and false claims, and this was followed by the standard action of confiscating his 1950 best-selling book on healthy diet, Look Younger, Live Longer. Now Hauser is remembered by many as a pioneer in his field and as founder of the natural food movement.

Let me end with one last example of Cold War suppression. In reading Nina Teicholz’s The Big Fat Surprise, I noticed a brief reference to Herman Taller, a New York obstetrician and gynecologist. He too was an advocate of natural health. His book Calories Don’t Count got him into trouble for the same predictable reasons with claims of “false and misleading” labeling. He also sold supplements, but nothing bizarre — from bran fiber to safflower oil capsules, the latter being brought up in the legal case. His argument was that, since fish oil was healthy, other polyunsaturated fatty acids (PUFAs) would likewise be beneficial. It turns out he was wrong about safflower oil, but his scientific reasoning was sound for what was known at the time. His broader advocacy of a high fat diet with a focus on healthy fats has become mainstream since. Certain PUFAs, the omega-3 fats, are absolutely necessary for basic physiological functioning and indeed most people in the modern world do not get enough of them.

Anyway, it was never about fair-minded scientific inquiry and debate. So $30,000 worth of safflower-oil capsules and 1,600 copies of his book were taken from several warehouses. To justify this action, FDA Commissioner George P. Larrick stated that, “The book is full of false ideas, as many competent medical and nutritional writers have pointed out. Contrary to the book’s basic premise, weight reduction requires the reduction of caloric intake. There is no easy, simple substitute. Unfortunately, calories do count.” He decreed this from on high as the ultimate truth; the government would not tolerate anyone challenging this official ideology. Yet scientists continue to debate the issue, with recent research siding with Taller’s conclusion. According to the best science presently available, it is easy to argue that calories don’t count or, to put it another way, calorie-counting diets have proven a failure in study after study — a fact so well known that mainstream doctors and medical experts admit to its sad truth, even as they go on advising people to follow it and then blaming them for its failure.

If you’ve ever wondered how Ancel Keys’ weak evidence and bad science came to dominate as official dietary recommendations pushed by medical institutions, the federal government and the food industry, the above will give you some sense of the raw force of government authority that was used to achieve this end. It wasn’t only voices of popular writers and celebrity figures that were silenced, eliminated, and discredited. Gary Taubes and Nina Teicholz discuss how a related persecution happened within academia where independent researchers lost funding and no longer were invited to speak at conferences. For a half century, it was impossible to seriously challenge this behemoth of the dietary-industrial complex. And during this era, scientific research was stunted. This Cold War era oppression is only now beginning to thaw.

The Literal Metaphor of Sickness

I’ve written about Lenore Skenazy before. She is one of my mom’s favorite writers and so she likes to share the articles with me. Skenazy has another piece about her usual topic, helicopter parents and their captive children. Today’s column, in the local newspaper (The Gazette), has the title “The irony of overprotection” (you can find it on the Creators website or from the GazetteXtra). She begins with a metaphor. In studying how leukemia is contracted, scientist Mel Greaves found that two conditions were required. The first is a genetic susceptibility, which exists only in a certain number of kids, although far from uncommon. But that alone isn’t sufficient without the second factor.

There has to be an underdeveloped or compromised immune system. And sadly this also has become far from uncommon. Further evidence of the hygiene hypothesis keeps accumulating (it should be called the hygiene theory at this point). Basically, it is only by being exposed to germs that a child’s immune system experiences the healthy stress that activates its normal development. Without this, many are left plagued by ongoing sickness, allergies, and autoimmune conditions for the rest of their lives.

Parents have not only protected their children from the larger dangers and infinite risks of normal childhood: skinned knees from roughhousing, broken limbs from falling from trees, hurt feelings from bullies, trauma from child molesters, murder from the roving bands of psychotic kidnappers who will sell your children on the black market, etc. Beyond such everyday fears, parents have also protected their kids from minor infections, with endless application of anti-bacterial products and cocooning them in sterile spaces that have been liberally doused with chemicals that kill all known microbial life forms. That is not a good thing, for the consequences are dire.

This is where the metaphor kicks in. Skenazy writes:

The long-term effects? Regarding leukemia, “when such a baby is eventually exposed to common infections, his or her unprimed immune system reacts in a grossly abnormal way,” says Greaves. “It overreacts and triggers chronic inflammation.”

Regarding plain old emotional resilience, what we might call “psychological inflammation” occurs when kids overreact to an unfamiliar or uncomfortable situation because they have been so sheltered from these. They feel unsafe, when actually they are only unprepared, because they haven’t been allowed the chance to develop a tolerance for some fears and frustrations. That means a minor issue can be enough to set a kid off — something we are seeing at college, where young people are at last on their own. There has been a surge in mental health issues on campuses.

It’s no surprise that anxiety would be spiking in an era when kids have had less chance to deal with minor risks from childhood on up.

There is only one minor point of disagreement I’d throw out. There is nothing metaphorical about this. Because of an antiseptic world and other causes (leaky gut, high-carb diet, sugar addiction, food additives, chemical exposure, etc), the immune systems of so many modern Americans are so dysfunctional and overreactive that it wreaks havoc on the body. Chronic inflammation has been directly linked to or otherwise associated with about every major health issue you can think of.

This includes, by the way, neurocognitive conditions such as depression and anxiety, but much worse as well. Schizophrenia, Alzheimer’s, etc also often involve inflammation. When inflammation gets into the brain, gut-brain axis, and/or nervous system, major problems follow with a diversity of symptoms that can be severe and life threatening, but they can also be problematic on a social and psychological level. This new generation of children is literally being brain-damaged, psychologically maimed, and left in a fragile state. For many of them, their bodies and minds are not fully prepared to deal with the real world with normal healthy responses. It is hard to manage the stresses of life when one is in a constant state of low-grade sickness that permanently sets the immune system on high, when even the most minor risks could endanger one’s well being.

The least of our worries is that diseases like type 2 diabetes (what used to be called adult-onset diabetes because it was unknown among children) are now increasing among children. Sure, adult illnesses will find their way earlier and earlier into young adulthood and childhood and the diseases of the elderly will hit people in middle age or younger. This will be a health crisis that could bankrupt and cripple our society. But worse than that is the human cost of sickness and pain, struggle and suffering. We are forcing this fate onto the young generations. That is cruel beyond comprehension. We can barely imagine what this will mean across the entire society when it finally erupts as a crisis.

We’ve done this out of ignorant good intentions of wanting to protect our children from anything that could touch them. It makes us feel better that we have created a bubble world of innocence where children won’t have to learn from the mistakes and failures, harms and difficulties we experienced in growing up. So instead, we’ve created something far worse for them.