It’s long been understood that ketones, specifically beta-hydroxybutyrate (BHB), are a brain superfuel. It’s easy to explain the evolutionary reasons for this. Hunter-gatherers spent much more time in ketosis. This metabolic state tends to occur during periods of low food access. That happens in winter but also throughout the year, as hunter-gatherers tend toward a feast and fast pattern.
After a period of plenty, it might be a while until the next big kill. Some hunting expeditions could take days or weeks, and the hunters needed their brains working in top form to succeed. Many hunter-gatherer tribes purposely fast on a regular basis as a demonstration of toughness, showing they can go long periods without food, an important ability for hunters. Even tribal people living amongst food abundance will fast for no particular reason other than that it’s part of their culture.
The Piraha, for example, can procure food easily and yet will go without, sometimes simply because they’d rather socialize around the village or are in the middle of communal dancing that can go on for days. They have better things to do than eat all the time. Besides, on the low-carb diet typical among hunter-gatherers, it takes little effort to fast. That is one of the benefits of ketosis: one’s appetite is naturally suppressed, so it effortlessly promotes caloric restriction.
Along with improving brain function, ketosis increases general health, probably including extending lifespan and certainly extending healthspan. Some of this could be explained by creating the conditions necessary for autophagy, although there are many other factors. An interesting example of this was shown in a mouse study.
The researchers exposed the rodents to influenza (E. L. Goldberg et al, Ketogenic diet activates protective γδ T cell responses against influenza virus infection). Most of the mice on a high-carb diet died, whereas most on the keto diet lived. In this case, it wasn’t the ketones themselves but other processes involved. Giving exogenous ketones as a supplement did not have the same effect as the body producing its own ketones. We typically think of ketosis only in terms of ketones, but obviously there is much more going on.
Still, in the case of neurocognitive functioning, the ketones themselves are key. It’s not only that they act as a superfuel but simultaneously alter epigenetic expression of specific genes related to memory. On the opposite side, research shows that feeding people sugar literally makes them dumber. Ketosis also decreases inflammation, including inflammation in the brain. Through multiple causal mechanisms, ketosis has been medically used as an effective treatment for numerous neurocognitive conditions: mood disorders, schizophrenia, autism, ADHD, Alzheimer’s, etc.
If ketosis is a biological indicator of food scarcity, why does the body expend extra energy when energy is limited? This seems counterintuitive. Other species deal with food scarcity by shutting the body down and slowing metabolism, some going into full or semi-hibernation during winter. But humans do the opposite: food scarcity increases physiological activity. In fact, ketosis is actually inefficient, as it burns more energy than is needed, with the excess given off as heat.
Benjamin Bikman, an insulin researcher, speculates this is because ketosis often happens in winter. Hibernating creatures lower their body temperature, but humans don’t have this capacity. Neither do we have thick fur. Humans need large amounts of heat to survive harsh winters. In ketosis, everything goes into overdrive: metabolism, immune system, and brain. This alters the epigenome itself, and those alterations can be passed on to following generations.
“The Minnesota twin study raised questions about the depth and pervasiveness of qualities specified by genes: Where in the genome, exactly, might one find the locus of recurrent nightmares or of fake sneezes? Yet it provoked an equally puzzling converse question: Why are identical twins different? Because, you might answer, fate impinges differently on their bodies. One twin falls down the crumbling stairs of her Calcutta house and breaks her ankle; the other scalds her thigh on a tipped cup of coffee in a European station. Each acquires the wounds, calluses, and memories of chance and fate. But how are these changes recorded, so that they persist over the years? We know that the genome can manufacture identity; the trickier question is how it gives rise to difference.”
~Siddhartha Mukherjee, Same But Different
If genetics are the words in a dictionary, then epigenetics is the creative force that forms those words into a library of books. Even the exact same words in the genomic code, as in identical twins, can be expressed in starkly different ways. Each gene’s expression depends on its relationship to numerous other genes, potentially thousands, and all of those genes together are moderated by epigenetics.
The epigenome itself can be altered by individual and environmental factors (type of work, exercise, and injuries; traumatic abuse, chronic stress, and prejudice; smoking, drinking, and malnutrition; clean or polluted air, water and soil; availability of green spaces, socioeconomic class, and level of inequality; etc). Then those changes can be passed on across multiple generations (e.g., the grandchildren of famine victims having higher obesity rates). This applies even to complex behaviors being inherited (e.g., the grandchildren of shocked mice, when exposed to cherry blossom scent, still jumping in response to the shock their grandparents experienced when exposed to the same scent).
What is rarely understood is that heritability rates don’t refer directly to genetics alone. They speak to the entire package of influences. We don’t only inherit genes; we also inherit epigenetic markers and environmental conditions, all of the confounders that make twin studies next to useless. Heritability is only meaningful at a population level and can say nothing directly about individual people or individual factors such as a specific gene. And at a population level, research has shown that behavioral and cultural traits can persist over centuries, apparently originating in distant historical events of which the living memory has long since disappeared, the memory lingering instead in some combination of heritable factors.
Even if epigenetics could only last several generations, though at least in some species much longer, the social conditions could continually reinforce those epigenetic changes so that they effectively become permanently set. And the epigenetics, in predisposing social behaviors, would create a vicious cycle of feeding back into the conditions that maintain the epigenetics. Or think of the centuries-long history of racism in the United States, where evidence shows racism remains pervasive, systemic, and institutional, in which case the heritability is partly being enforced upon an oppressed underclass by those with wealth, privilege, and power. That wealth, power, and privilege are likewise heritable, as is the entire social order. No one part can be disentangled from the rest, for none of us are separate from the world we are born into.
Now consider that any given disease, behavior, or personality trait might be determined by thousands of genes, thousands of epigenetic markers, and thousands of external factors. Changing any single part of that puzzle might rearrange the entire result, even leading to a completely opposite expression. The epigenome determines not only whether a gene is expressed but how it is expressed, because it determines which words are used from the genomic dictionary and how those words are linked into sentences, paragraphs, and chapters. So one gene might correlate as heritable with something in a particular society while correlating with something else entirely in a different society. The same gene could have immense possible outcomes, just as the same word can be found in hundreds of thousands of books. Many of the same words appear in both Harry Potter and Hamlet, but that doesn’t help us understand what makes one book different from the other. This is a useful metaphor, although an aspect of it might be quite literal considering what has been shown in the research on linguistic relativity.
There is no part of our lives not touched by language in shaping thought and affect, perception and behavior. Rather than a Chomskyan language organ that we inherit, maybe language is partly passed on through the way epigenetics ties together genes and environment. Even our scientific way of thinking about such issues probably leaves epigenetic markers that might predispose our children and grandchildren to think scientifically as well. What I’m describing in this post is a linguistically-filtered narrative upheld by a specific Jaynesian voice of authorization in our society. Our way of speaking and understanding changes us, even at a biological level. We are unable to stand back from the very thing about which we speak. In fact, it has been the language of scientific reductionism that has made it so difficult to come to this new insight into human nature, that we are complex beings in a complex world. And that scientific reductionism has been a central component of the entire ruling paradigm, which continues to resist this challenging view.
Epigenetics can last across generations, but it can also be changed in a single lifetime. For centuries, we enforced upon the world, often violently and through language, an ideology of genetic determinism and race realism. The irony is that the creation of this illusion of an inevitable and unalterable social order was only possible through the elite’s control of environmental conditions and hence epigenetic factors. Yet as soon as this enforcement ends, the illusion drifts away like a fog dissipated by a strong wind, and through clear vision the actual landscape is revealed, a patchwork of possible pathways. We are constantly re-created by our inheritance, biological and environmental, and in turn we re-create the social order we find. But with new ways of speaking will come new ways of perceiving and acting in the world, and from that a different kind of society could form.
[This post is based on what is emerging in this area of research, but some of it remains speculative. Epigenetics, specifically, is still a young field, and it’s difficult to detect and follow such changes across multiple generations. If and when someone proves that linguistic relativity can reach even to the level of the epigenome, a seeming inevitability (considering it’s already been shown that language alters behavior and that behavior alters epigenetics), that could be the death blow to the already ailing essentialist paradigm (Essentialism On the Decline). According to the status quo, epigenetics is almost too radical to be believed, as is linguistic relativity. Yet we know each is true to a larger extent than present thought allows. Combine the two and we might have a revolution of the mind.]
They note that researchers often assume that those thousands of weakly-acting genetic variants will all cluster together in relevant genes. For example, you might expect that height-associated variants will affect genes that control the growth of bones. Similarly, schizophrenia-associated variants might affect genes that are involved in the nervous system. “There’s been this notion that for every gene that’s involved in a trait, there’d be a story connecting that gene to the trait,” says Pritchard. And he thinks that’s only partly true.
Yes, he says, there will be “core genes” that follow this pattern. They will affect traits in ways that make biological sense. But genes don’t work in isolation. They influence each other in large networks, so that “if a variant changes any one gene, it could change an entire gene network,” says Boyle. He believes that these networks are so thoroughly interconnected that every gene is just a few degrees of separation away from every other. Which means that changes in basically any gene will ripple inwards to affect the core genes for a particular trait.
The Stanford trio call this the “omnigenic model.” In the simplest terms, they’re saying that most genes matter for most things.
More specifically, it means that all the genes that are switched on in a particular type of cell—say, a neuron or a heart muscle cell—are probably involved in almost every complex trait that involves those cells. So, for example, nearly every gene that’s switched on in neurons would play some role in defining a person’s intelligence, or risk of dementia, or propensity to learn. Some of these roles may be starring parts. Others might be mere cameos. But few genes would be left out of the production altogether.
This might explain why the search for genetic variants behind complex traits has been so arduous. For example, a giant study called… er… GIANT looked at the genomes of 250,000 people and identified 700 variants that affect our height. As predicted, each has a tiny effect, raising a person’s stature by just a millimeter. And collectively, they explain just 16 percent of the variation in heights that you see in people of European ancestry.
Over the past five years, Benjamin has been part of an international team of researchers identifying variations in the human genome that are associated with how many years of education people get. In 2013, after analyzing the DNA of 101,000 people, the team found just three of these genetic variants. In 2016, they identified 71 more after tripling the size of their study.
Now, after scanning the genomes of 1,100,000 people of European descent—one of the largest studies of this kind—they have a much bigger list of 1,271 education-associated genetic variants. The team—which includes Peter Visscher, David Cesarini, James Lee, Robbee Wedow, and Aysu Okbay—also identified hundreds of variants that are associated with math skills and performance on tests of mental abilities.
The team hasn’t discovered “genes for education.” Instead, many of these variants affect genes that are active in the brains of fetuses and newborns. These genes influence the creation of neurons and other brain cells, the chemicals these cells secrete, the way they react to new information, and the way they connect with each other. This biology affects our psychology, which in turn affects how we move through the education system.
This isn’t to say that staying in school is “in the genes.” Each genetic variant has a tiny effect on its own, and even together, they don’t control people’s fates. The team showed this by creating a “polygenic score”—a tool that accounts for variants across a person’s entire genome to predict how much formal education they’re likely to receive. It does a lousy job of predicting the outcome for any specific individual, but it can explain 11 percent of the population-wide variation in years of schooling.
That’s terrible when compared with, say, weather forecasts, which can correctly predict about 95 percent of the variation in day-to-day temperatures.
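The polygenic score described above is, mechanically, nothing exotic: each variant’s estimated effect size is multiplied by how many copies of the effect allele a person carries, and the products are summed across the genome. A minimal sketch of the idea, with entirely made-up variant names, effect sizes, and genotypes:

```python
# Toy polygenic score: a weighted sum of allele counts.
# The variant IDs, effect sizes, and genotype below are hypothetical,
# invented for illustration; a real score sums over thousands of
# variants with effect sizes estimated by a GWAS.

# Per-variant effect sizes (e.g., extra years of schooling per allele copy)
effects = {"rs_A": 0.02, "rs_B": -0.01, "rs_C": 0.005}

# One person's genotype: copies (0, 1, or 2) of each effect allele
genotype = {"rs_A": 2, "rs_B": 1, "rs_C": 0}

def polygenic_score(effects, genotype):
    """Sum of effect_size * allele_count over all scored variants."""
    return sum(effects[v] * genotype.get(v, 0) for v in effects)

score = polygenic_score(effects, genotype)
print(round(score, 3))  # 0.02*2 + (-0.01)*1 + 0.005*0 = 0.03
```

As the article stresses, such a score is only meaningful statistically: it can account for a slice of population-wide variation (the 11 percent figure) while doing a lousy job of predicting any specific individual.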
Each gene has a regulatory region that contains the instructions controlling when and where the gene is expressed. This gene regulatory code is read by proteins called transcription factors that bind to specific ‘DNA words’ and either increase or decrease the expression of the associated gene.
Under the supervision of Professor Jussi Taipale, researchers at Karolinska Institutet have previously identified most of the DNA words recognised by individual transcription factors. However, much like in a natural human language, the DNA words can be joined to form compound words that are read by multiple transcription factors. The mechanism by which such compound words are read had not previously been examined. Therefore, in their recent study in Nature, the Taipale team examines the binding preferences of pairs of transcription factors, and systematically maps the compound DNA words they bind to.
Their analysis reveals that the grammar of the genetic code is much more complex than that of even the most complex human languages. Instead of simply joining two words together by deleting a space, the individual words that are joined together in compound DNA words are altered, leading to a large number of completely new words.
“Our study identified many such words, increasing the understanding of how genes are regulated both in normal development and cancer,” says Arttu Jolma. “The results pave the way for cracking the genetic code that controls the expression of genes.”
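The “DNA words” idea above can be made concrete with a toy motif scan. The sequence and motifs here are invented for illustration (real binding sites are described probabilistically, by position weight matrices, not exact strings), but the sketch shows the core notion: a factor reads a short word, and a compound word is not a simple concatenation of its parts.

```python
# Toy illustration of transcription factors reading "DNA words".
# Sequence and motifs are hypothetical, chosen only to illustrate
# exact-string scanning; real TF binding is probabilistic.

def find_word(sequence, motif):
    """Return every 0-based position where a DNA 'word' occurs."""
    return [i for i in range(len(sequence) - len(motif) + 1)
            if sequence[i:i + len(motif)] == motif]

seq = "ATGCACGTGTTTACGCACGTG"

# Two individual "words", each read by a hypothetical factor
print(find_word(seq, "CACGTG"))     # [3, 15]
print(find_word(seq, "TTTACG"))     # [9]

# An overlapping composite of the two words: the joined sequence is
# not the plain concatenation of the parts, loosely analogous to the
# altered compound words the study describes.
print(find_word(seq, "GTGTTTACG"))  # [6]
```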
Dr. Catherine Shanahan is a board-certified family physician with an undergraduate degree in biology, along with training in biochemistry and genetics. She has also studied ethno-botany, culinary traditions, and ancestral health. Besides regularly appearing in and writing for national media, she has worked as director and nutrition consultant for the Los Angeles Lakers. On High Intensity Health, she was interviewed by nutritionist Mike Mutzel (Fat Adapted Athletes Perform Better). At the 31:55 mark in that video, she discussed diet (in particular, industrial vegetable oils or simply seed oils), epigenetic inheritance, de novo genetic mutations, and autism. This can be found in the show notes (#172) where it is stated that,
“In 1909 we consumed 1/3 of an ounce of soy oil per year. Now we consume about 22 pounds per year. In the amounts that we consume seed oils, it breaks down into some of the worst toxins ever discovered. They are also capable of damaging our DNA. Many diseases are due to mutations that children have that their parents did not have. This means that mothers and fathers with poor diets have eggs/sperm that have mutated DNA. Children with autism have 10 times the number of usual mutations in their genes. Getting off of seed oils is one of the most impactful things prospective parents can do. The sperm has more mutations than the egg.”
These seed oils didn’t exist in the human diet until the industrial era. Our bodies are designed to use and incorporate the PUFAs from natural sources, but the processing into oils through high pressure and chemicals denatures the structure of the oil and destroys the antioxidants. The oxidative stress that follows from adding them to the diet comes precisely because these altered oils act as trojan horses, being treated by the body like natural fats. This is magnified by a general increase of PUFAs, specifically omega-6 fatty acids, with a simultaneous decrease of omega-3 fatty acids and saturated fats. It isn’t a difference in overall fat intake, as the roughly 40 percent of calories we now get from fat is about the same as at the beginning of the last century. What is different is these oxidized PUFAs combined with massive loads of sugar and starches like never seen before.
Dr. Shanahan sees these industrial plant oils as the single greatest harm, such that she doesn’t consider them to be a food but a toxin, originally discovered as an industrial byproduct. She is less worried about any given category of food or macronutrient, as long as you first and foremost remove this specific source of toxins.** She goes into greater detail in a talk from Ancestry Foundation (AHS16 – Cate Shanahan – Bad Diet, Bad DNA?). And her book, Deep Nutrition, is a great resource on this topic. I’ll leave that for you to further explore, if you so desire. Let me quickly and simply note an implication of this.
Genetic mutations demonstrate how serious the situation is. The harm we are causing ourselves might go beyond mere punishment for our personal sins: the sins of the father and mother may genetically pass on to their children, grandchildren, and further on (one generation of starvation or smoking among grandparents leads to generations of smaller birth weight and underdevelopment among the grandchildren and maybe beyond, even if the intervening generation of parents was healthy).
It might not be limited to a temporary transgenerational harm as seen with epigenetics. This could be permanent harm to our entire civilization, fundamentally altering our collective gene pool. We could recover from epigenetics within a few generations, assuming we took the problem seriously and acted immediately (Dietary Health Across Generations), but with genetic mutations we may never be able to undo the damage. These mutations have been accumulating and will continue to accumulate, until we return to an ancestral diet of healthy foods as part of an overall healthy lifestyle and environment. Even mutations can be moderated by epigenetics, as the body is designed to deal with them.
This further undermines genetic determinism and biological essentialism. We aren’t mere victims doomed to a fate beyond our control. This dire situation is being created by all of us, individually and collectively. There is no better place to begin than with your own health, but we better also treat this as a societal crisis verging on catastrophe. It was public policies and an international food system that created the conditions that enacted and enforced this failed mass experiment of dietary dogma and capitalist realist profiteering. Maybe we could try something different, something less psychopathically authoritarian, less psychotically disconnected from reality, less collectively suicidal. Heck, it’s worth a try.
* * *
** I’d slightly disagree with her emphasis. She thinks what matters most is the changes over the past century. There is a good point made in this focus on late modernity. But I’d note that industrialization and modern agriculture began in the prior centuries.
It was in the colonial era that pasta was introduced to Italy, potatoes to Ireland, and sugar throughout the Western world. It wasn’t until the late 1700s and more clearly in the early 1800s that there were regular grain surpluses that made grains available for feeding/fattening both humans and cattle. In particular, it was around this time that agricultural methods improved for wheat crops, allowing it to be affordable to the general public for the first time in human existence and hence causing white bread to become common during the ensuing generations.
I don’t know about diseases like Alzheimer’s, Parkinson’s, and multiple sclerosis. But I do know that the most major diseases of civilization (obesity, diabetes, cancer, and mental illness) were first noticed to be on the rise during the 1700s and 1800s or sometimes earlier, long before industrial oils or the industrial revolution that made these oils possible. The high-carb diet appeared gradually with colonial trade and spread across numerous societies, first hitting the wealthiest before eventually being made possible for the dirty masses. During this time, it was observed by doctors, scientists, missionaries and explorers that obesity, diabetes, cancer, mental illness and moral decline quickly followed on the heels of this modern diet.
Seed oils were simply the final Jenga block pulled from the ever-growing and ever more wobbly tower, replacing the healthy nutrient-dense animal fats (full of fat-soluble vitamins, choline, omega-3 fatty acids, etc.) that were counterbalancing some of the worst effects of the high-carb diet. But seed oils, as with farm chemicals such as glyphosate, would never have had as severe and dramatic an impact if not for the previous centuries of worsening diet and health. It had been building up over a long time, and the tower was doomed to topple right from the start. We are simply now at the tipping point, the inevitable conclusion of a sad trajectory.
Still, it’s never too late… or let us hope. Dr. Shanahan prefers to end on an optimistic note. And I’d rather not disagree with her about that. I’ll assume she is right or that she is at least in the general ballpark. Let us do as she suggests. We need more and better research, but somehow industrial seed oils have slipped past the notice of autism researchers.
Dr. Cate: Genetic Wealth is the idea that if your parents or grandparents ate traditional and nutrient-rich foods, then you came into the world with genes that could express in an optimal way, and this makes you more likely to look like a supermodel and be an extraordinary athlete. Take Angelina Jolie or Michael Jordan, for instance. They’ve got loads of genetic wealth.
Genetic Momentum describes the fact that, once you have that extraordinary genetic wealth, you don’t have to eat so great to be healthier than the average person. It’s like being born into a kind of royalty. You always have that inheritance around and you don’t need to work at your health in the same way other people do.
These days, for most of us, it was our grandparents or great grandparents who were the last in our line to grow up on a farm or get a nutrient-rich diet. In my case, I have to go back 4 generations to the Irish and Russian farmers who immigrated to NYC where my grandparents on both sides could only eat cheap food; sometimes good things like chopped liver and beef tongue, but often preserves and crackers and other junk. So my grandparents were far healthier than my brother and sisters and I.
The Standard American Diet (SAD) has accelerated the processes of genetic wealth being spent down, genetic momentum petering out, and the current generation getting sick earlier than their parents and grandparents. This is a real, extreme tragedy on the order of end-of-the-world level losses of natural resources. Genetic wealth is a kind of natural resource. And loss of genetic wealth is a more urgent problem than peak oil or the bursting of the housing bubble. But of course nobody is talking about it directly, only indirectly, in terms of increased rates of chronic disease.
Take autism, for example. Why is autism so common? I don’t think vaccines are the reason for the vast vast majority of cases, since subtle signs of autism can be seen before vaccination in the majority. I think the reason has to do with loss of genetic wealth. We know that children with autism exhibit DNA mutations that their parents and grandparents did not have. Why? Because in the absence of necessary nutrients, DNA cannot even duplicate itself properly and permanent mutations develop.
(Here’s an article on one kind of genetic mutation (DNA deletions) associated with autism.)
Fortunately, most disease is not due to permanent letter mutations and therefore a good diet can rehabilitate a lot of genetic disease that is only a result of altered genetic expression. To put your high-school biology to work, it’s the idea of genotype versus phenotype. You might have the genes that make you prone to, for example, breast cancer (the BRCA1 mutation), but you might not get the disease if you eat right because the gene expression can revert back to normal.
Deep Nutrition: Why Your Genes Need Traditional Food by Dr. Catherine Shanahan
In 2007, a consortium of geneticists investigating autism boldly announced that the disease was not genetic in the typical sense of the word, meaning that you inherit a gene for autism from one or both of your parents. New gene sequencing technologies had revealed that many children with autism had new gene mutations, never before expressed in their family line.
An article published in the prestigious journal Proceedings of the National Academy of Sciences states, “The majority of autisms are a result of de novo mutations, occurring first in the parental germ line.” 42 The reasons behind this will be discussed in Chapter 9.
In 2012, a group investigating these new, spontaneous mutations discovered evidence that randomness was not the sole driving force behind them. Their study, published in the journal Cell, revealed an unexpected pattern of mutations occurring 100 times more often in specific “hotspots,” regions of the human genome where the DNA strand is tightly coiled around organizing proteins called histones that function much like spools in a sewing kit, which organize different colors and types of threads. 43
The consequences of these mutations seem specifically designed to toggle up or down specific character traits. Jonathan Sebat, lead author on the 2012 article, suggests that the hotspots are engineered to “mutate in ways that will influence human traits” by toggling up or down the development of specific behaviors. For example, when a certain gene located at a hotspot on chromosome 7 is duplicated, children develop autism, a developmental delay characterized by near total lack of interest in social interaction. When the same chromosome is deleted, children develop Williams Syndrome, a developmental delay characterized by an exuberant gregariousness, where children talk a lot, and talk with pretty much anyone. The phenomenon wherein specific traits are toggled up and down by variations in gene expression has recently been recognized as a result of the built-in architecture of DNA and dubbed “active adaptive evolution.” 44
As further evidence of an underlying logic driving the development of these new autism-related mutations, it appears that epigenetic factors activate the hotspot, particularly a kind of epigenetic tagging called methylation. 45 In the absence of adequate B vitamins, specific areas of the gene lose these methylation tags, exposing sections of DNA to the factors that generate new mutations. In other words, factors missing from a parent’s diet trigger the genome to respond in ways that will hopefully enable the offspring to cope with the new nutritional environment. It doesn’t always work out, of course, but that seems to be the intent.
You could almost see it as the attempt to adjust character traits in a way that will engineer different kinds of creative minds, so that hopefully one will give us a new capacity to adapt.
What Is Autism?
The very first diagnostic manual for psychiatric disorders published in 1954 described autism simply as “schizophrenic reaction, childhood type.” 391 The next manual, released in 1980, listed more specific criteria, including “pervasive lack of responsiveness to other people” and “if speech is present, peculiar speech patterns such as immediate and delayed echolalia, metaphorical language, pronominal reversal (using you when meaning me, for instance).” 392 Of course, the terse language of a diagnostic manual can never convey the real experience of living with a child on the spectrum, or living on the spectrum yourself.
When I graduated from medical school, autism was so rarely diagnosed that none of my psychiatry exams even covered it and I and my classmates were made aware of autism more from watching the movie Rain Man than from studying course material. The question of whether autism (now commonly referred to as ASD) is more common now than it was then or whether we are simply recognizing it more often is still controversial. Some literature suggests that it is a diagnostic issue, and that language disorders are being diagnosed less often as autism is being diagnosed more. However, according to new CDC statistics, it appears that autism rates have risen 30 percent between 2008 and 2012. Considering that diagnostic criteria had been stable by that point in time for over a decade, increased diagnosis is unlikely to be a major factor in this 30 percent figure. 393
Given these chilling statistics, it’s little wonder that so many research dollars have been dedicated to exploring possible connections between exposure to various environmental factors and development of the disorder. Investigators have received grants to look into a possible link between autism and vaccines, 394 smoking, 395 maternal drug use (prescription and illicit), 396 , 397 , 398 organophosphates, 399 and other pesticides, 400 BPA, 401 lead, 402 mercury, 403 cell phones, 404 IVF and infertility treatments, 405 induced labor, 406 high-powered electric wires, 407 flame retardants, 408 ultrasound, 409 —and just about any other environmental factor you can name. You might be wondering if they’ve also looked into diet. But of course: alcohol, 410 cow’s milk, 411 milk protein, 412 soy formula, 413 gluten, 414 and food colorings 415 have all been investigated. Guess what they’ve never dedicated a single study to investigating? Here’s a hint: it’s known to be pro-oxidative and pro-inflammatory and contains 4-HNE, 4-HHE, and MDA, along with a number of other equally potent mutagens. 416 Still haven’t guessed? Okay, one last hint: it’s so ubiquitous in our food supply that for many Americans it makes up as much as 60 percent of their daily caloric intake, 417 a consumption rate that has increased in parallel with rising rates of autism.
Of course, I’m talking about vegetable oil. In Chapter 2 , I discussed in some detail how and why gene transcription, maintenance, and expression are necessarily imperiled in the context of a pro-inflammatory, pro-oxidative environment, so I won’t go further into that here. But I do want to better acquaint you with the three PUFA-derived mutagens I just named because when they make it to the part of your cell that houses DNA, they can bind to DNA and create new, “de novo,” mutations. DNA mutations affecting a woman’s ovaries, a man’s sperm, or a fertilized embryo can have a devastating impact on subsequent generations.
First, let’s revisit 4-HNE (4-hydroxynonenal), which you may recall meeting in the above section on firebombing the highways. This is perhaps the most notorious of all the toxic fats derived from oxidation of omega-6 fatty acids, whose diversity of toxic effects requires that entire chemistry journal issues be devoted to 4-HNE alone. When the mutagenicity (ability to mutate DNA) of 4-HNE was first described in 1985, its cytotoxicity (ability to kill cells) had already been established for decades. The authors of a 2009 review article explain that the reason it had taken so long to recognize that HNE was such an effective carcinogen was largely due to the fact that “the cytotoxicity [cell-killing ability] of 4-HNE masked its genotoxicity [DNA-mutating effect].” 419 In other words, it kills cells so readily that they don’t have a chance to divide and mutate. How potently does 4-HNE damage human DNA? After interacting with DNA, 4-HNE forms a compound called an HNE-adduct, and that adduct prevents DNA from copying itself accurately. Every time 4-HNE binds to a guanosine (the G of the four-letter ACGT DNA alphabet), there is somewhere between a 0.5 and 5 percent chance that G will not be copied correctly, and that the enzyme trying to make a perfect copy of DNA will accidentally turn G into T. 420 Without 4-HNE, the chance of error is about a millionth of a percent. 421 In other words, 4-HNE increases the DNA mutation rate roughly a million-fold!
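That “roughly a million-fold” figure can be sanity-checked with a quick back-of-the-envelope calculation, using only the numbers cited above (0.5 to 5 percent with 4-HNE bound, versus about a millionth of a percent without):

```python
# Per-copy chance that a 4-HNE-bound G is miscopied as T (0.5 to 5 percent)
with_hne_low, with_hne_high = 0.005, 0.05

# Baseline miscopy chance without 4-HNE: about a millionth of a percent
baseline = 1e-6 / 100  # = 1e-8 as a probability

fold_low = with_hne_low / baseline    # 500,000-fold increase
fold_high = with_hne_high / baseline  # 5,000,000-fold increase

print(f"{fold_low:,.0f}x to {fold_high:,.0f}x")  # → 500,000x to 5,000,000x
```

The range spans half a million to five million, so “roughly a million times” is a fair summary of the cited figures.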
Second, there is 4-HHE (4-hydroxyhexenal), which is very much like 4-HNE, its more notorious bigger brother, except that 4-HHE is derived from omega-3 rather than omega-6. If bad guys had sidekicks, 4-HNE’s would be 4-HHE. 4-HHE does many of the same things to DNA as 4-HNE, but it was discovered only recently. 422 You see, when omega-6 reacts with oxygen, it breaks apart into two major end products, whereas omega-3, being more explosive, flies apart into four different molecules. This means each one is present in smaller amounts, and that makes them a little more difficult to study. But it doesn’t make 4-HHE any less dangerous. 4-HHE specializes in burning through your glutathione peroxidase antioxidant defense system. 423 This selenium-based antioxidant enzyme is one of the three major enzymatic antioxidant defense systems, and it may be the most important player defending your DNA against oxidative stress. 424 , 425
Finally, there is malondialdehyde (MDA), proven to be a mutagen in 1984, but presumed to come only from consumption of cooked and cured meats. 426 Only in the past few decades have we had the technology to determine that MDA can be generated in our bodies as well. 427 And unlike the previous two chemicals, MDA is generated by oxidation of both omega-3 and omega-6. It may be the most common endogenously derived oxidation product. Dr. L. J. Marnett, who directs a cancer research lab at Vanderbilt University School of Medicine, Nashville, Tennessee, and who has published over 400 articles on the subject of DNA mutation, summarized his final article on MDA with the definitive statement that MDA “appears to be a major source of endogenous DNA damage [endogenous, here, meaning due to internal, metabolic factors rather than, say, radiation] in humans that may contribute significantly to cancer and other genetic diseases.” 428
There’s one more thing I need to add about vegetable-oil-derived toxic breakdown products, particularly given the long list of toxins now being investigated as potential causes of autism spectrum disorders. Not only do they directly mutate DNA, they also make DNA more susceptible to mutations induced by other environmental pollutants. 429 , 430 This means that if you start reading labels and taking vegetable oil out of your diet, your body will more readily deal with the thousands of contaminating toxins not listed on the labels, which are nearly impossible to avoid.
Why all this focus on genes when we’re talking about autism? Nearly every day a new study comes out that further consolidates the consensus among scientists that autism is commonly a genetic disorder. The latest research is focusing on de novo mutations, meaning mutations neither parent had themselves but that arose spontaneously in their egg, sperm, or during fertilization. These mutations may affect single genes, or they may manifest as copy number variations, in which entire stretches of DNA containing multiple genes are deleted or duplicated. Geneticists have already identified a staggering number of genes that appear to be associated with autism. In one report summarizing results of examining 900 children, scientists identified 1,000 potential genes: “exome sequencing of over 900 individuals provided an estimate of nearly 1,000 contributing genes.” 431
All of these 1,000 genes are involved with proper development of the part of the brain most identified with the human intellect: our cortical gray matter. This is the stuff that enables us to master human skills: spoken language, reading, writing, dancing, playing music, and, most important, the social interaction that drives the desire to do all of the above. Miscopying just a few of these 1,000 genes, or in some cases just one, can alter brain development enough to place a child on the ASD spectrum.
So just a few troublemaker genes can obstruct the entire brain development program. But for things to go right, all the genes for brain development need to be fully functional.
Given that humans are thought to have only around 20,000 genes, and already 1,000 are known to be essential for building brain, that means geneticists have already labeled 5 percent of the totality of our genetic database as crucial to the development of a healthy brain—and we’ve just started looking. At what point does it become a foolish enterprise to continue to look for genes that, when mutated, are associated with autism? When we’ve identified 5,000? Or 10,000? The entire human genome? At what point do we stop focusing myopically only on those genes thought to play a role in autism?
I’ll tell you when: when you learn that the average autistic child’s genome carries de novo mutations not just in genes thought to be associated with autism, but across the board, throughout the entirety of the chromosomal landscape. Because once you’ve learned this, you can’t help but consider that autism might be better characterized as a symptom of a larger disease—a disease that results in an overall increase in de novo mutations.
Almost buried by the avalanche of journal articles on genes associated with autism is the finding that autistic children exhibit roughly ten times the number of de novo mutations compared to their typically developing siblings. 432 An international working group on autism pronounced this startling finding in a 2013 article entitled: “Global Increases in Both Common and Rare Copy Number Load Associated With Autism.” 433 ( Copy number load refers to mutations wherein large segments of genes are duplicated too often.) What the article says is that yes, children with autism have a larger number of de novo mutations, but the majority of their new mutations are not statistically associated with autism because other kids have them, too. The typically developing kids just don’t have nearly as many.
These new mutations are not only affecting genes associated with brain development. They are affecting all genes seemingly universally. What is more, there is a dose response relationship between the total number of de novo mutations and the severity of autism such that the more gene mutations a child has (the bigger the dose of mutation), the worse their autism (the larger the response). And it doesn’t matter where the mutations are located—even in genes that have no obvious connection to the brain. 434 This finding suggests that autism does not originate in the brain, as has been assumed. The real problem—at least for many children—may actually be coming from the genes. If this is so, then when we look at a child with autism, what we’re seeing is a child manifesting a global genetic breakdown. Among the many possible outcomes of this genetic breakdown, autism may simply be the most conspicuous, as the cognitive and social hallmarks of autism are easy to recognize.
As the authors of the 2013 article state, “Given the large genetic target of neurodevelopmental disorders, estimated in the hundreds or even thousands of genomic loci, it stands to reason that anything that increases genomic instability could contribute to the genesis of these disorders.” 435 Genomic instability —now they’re on to something. Because framing the problem this way helps us to ask the more fundamental question, What is behind the “genomic instability” that’s causing all these new gene mutations?
In the section titled “What Makes DNA Forget” in Chapter 2 , I touched upon the idea that an optimal nutritional environment is required to ensure the accurate transcription of genetic material and communication of epigenetic bookmarking, and how a pro-oxidative, pro-inflammatory diet can sabotage this delicate operation in ways that can lead to mutation and alter normal growth. There I focused on mistakes made in epigenetic programming, what you could call de novo epigenetic abnormalities. The same prerequisites that support proper epigenetic data communication, I submit, apply equally to the proper transcription of genetic data.
What’s the opposite of a supportive nutritional environment? A steady intake of pro-inflammatory, pro-oxidative vegetable oil that brings with it the known mutagenic compounds of the kind I’ve just described. Furthermore, if exposure to these vegetable oil-derived mutagens causes a breakdown in the systems for accurately duplicating genes, then you might expect to find other detrimental effects from this generalized defect of gene replication. Indeed we do. Researchers in Finland have found that children anywhere on the ASD spectrum have between 1.5 and 2.7 times the risk of being born with a serious birth defect, most commonly a life-threatening heart defect or neural tube (brain and spinal cord) defect that impairs the child’s ability to walk. 436 Another group, in Nova Scotia, identified a similarly increased rate of minor malformations, such as abnormally rotated ears, small feet, or closely spaced eyes. 437
What I’ve laid out here is the argument that the increasing prevalence of autism is best understood as a symptom of De Novo Gene Mutation Syndrome brought on by oxidative damage, and that vegetable oil is the number-one culprit in creating these new mutations. These claims emerge from a point-by-point deduction based on the best available chemical, genetic, and physiologic science. To test the validity of this hypothesis, we need more research.
Does De Novo Gene Mutation Syndrome Affect Just the Brain?
Nothing would redirect the trajectory of autism research in a more productive fashion than reframing autism as a symptom of the larger underlying disease, which we are provisionally calling de novo gene-mutation syndrome, or DiNGS. (Here’s a mnemonic: vegetable oil toxins “ding” your DNA, like hailstones pockmarking your car.)
If you accept my thesis that the expanding epidemic of autism is a symptom of an epidemic of new gene mutations, then you may wonder why the only identified syndrome of DiNGS is autism. Why don’t we see all manner of new diseases associated with gene mutations affecting organs other than the brain? We do. According to the most recent CDC report on birth defect incidence in the United States, twenty-nine of the thirty-eight organ malformations tracked have increased. 438
However, these are rare events, occurring far less frequently than autism. The reason for the difference derives from the fact that the brain of a developing baby can be damaged to a greater degree than other organs can, while still allowing the pregnancy to carry to term. Though the complex nature of the brain makes it the most vulnerable in terms of being affected by mutation, this aberration of development does not make the child more vulnerable in terms of survival in utero. The fact that autism affects the most evolutionarily novel portion of the brain means that as far as viability of an embryo is concerned, it’s almost irrelevant. If the kinds of severely damaging mutations leading to autism were to occur in organs such as the heart, lungs, or kidneys, fetal survival would be imperiled, leading to spontaneous miscarriage. Since these organs begin developing as early as four to six weeks of in-utero life, failure of a pregnancy this early might occur without any symptoms other than bleeding, which might be mistaken for a heavy or late period, and before a mother has even realized she’s conceived.
* * *
Rhonda Patrick’s view is similar to that of Shanahan:
“In the famous ‘Pottenger Cat Study’ it took one generation for the deterioration to occur when diet was inadequate and a further three generations on an optimal diet for their physical condition to return to that of the original cats…”
~Effect of western diet on facial and dental development
It’s common to blame individuals for the old Christian sins of sloth and gluttony. But that has never made much sense, at least not scientifically. Gary Taubes has discussed this extensively; look to his several books for more on why applying Christian theology to diet, nutrition, and health is not a wise strategy for evidence-based medicine and public health policy.
Yes, Americans in particular would be wise to do something about their health in a society where 88% of the adult population has at least one marker of metabolic dysfunction, with about three-quarters overweight and about half diabetic or prediabetic (Joana Araújo, Jianwen Cai, June Stevens, “Prevalence of Optimal Metabolic Health in American Adults: National Health and Nutrition Examination Survey 2009–2016”; for more info, see The University of North Carolina at Chapel Hill or Science Daily). Consider that these statistics are even worse for the younger generations. But let’s put this in even greater context. It’s not only that each generation is unhealthier than the last; this declining health is being inherited from before birth. There is now an obesity epidemic among six-month-old babies. I doubt anyone thinks it’s reasonable to blame babies. Should babies eat less and exercise more?
This goes back a while. European immigrants in the early 1900s noticed how American children were much chubbier than their European counterparts. By the 1950s, there was already discussion of an obesity epidemic, as it was becoming noticeable among the younger generations. We are several generations into this modern industrialized diet of highly processed starchy carbs, added sugar, and seed oils. Much of this is caused by worsening environmental conditions, from harmful chemicals to the industrial food system. The effects begin in the womb, but the causality can extend across numerous generations.
This is called epigenetics, what determines which genes get expressed and how. And this epigenetic effect is magnified by the microbiome we inherit as well, since microbes help determine some of the epigenetic effect, involving short-chain fatty acids that can be obtained either through plant or animal foods (Fiber or Not: Short-Chain Fatty Acids and the Microbiome). This is important, as it is easier and more straightforward to manipulate our microbiome than our epigenetics, or at least our knowledge is more clear about the former. By changing our diet, we can change our microbiome. And by changing our microbiome, we can change our epigenetics and that of our children and grandchildren.
The dietary aspect is the most basic component, in that some diets seem to act directly on the epigenome itself, whether or not the microbiome is involved; for example, there is “recent evidence that KD [ketogenic diet] influences the epigenome through modulation of adenosine metabolism as a plausible antiepileptogenic mechanism of the diet” (Theresa A. Lusardi & Detlev Boison, Ketogenic Diet, Adenosine, Epigenetics, and Antiepileptogenesis). It’s been proven for about a century now that the ketogenic diet is the most effective treatment for epileptic seizures, but there has been much debate about why. Now we might know the reason: the mechanism appears to be epigenetic.
This is not exactly new knowledge (Health From Generation To Generation). Such cross-generational influences have been known since early last century, but sadly such knowledge is not epigenetically inherited by each succeeding generation. Francis M. Pottenger Jr. studied the health of cats on severely malnourished and well-nourished diets — by the third generation the malnourished cats were no longer capable of breeding, and so there was no fourth generation. This doesn’t perfectly translate to the present human diet, although it does make one wonder. Many of our diseases of civilization seem to be at least partly caused by malnourishment. This is as much a national security crisis as a public health epidemic.
Here is the question that comes to mind: In this modern industrialized diet, what generation of malnourishment are we at now? And if as a society we changed public health policies and medical practice right now, how many generations would it take to reverse the trend and fully undo the damage? To end on a positive note, we could potentially turn it around within this century: “Dr. Pottenger’s research also showed that the health of the cats could be recovered if the diet were returned to a healthy one by the second generation; however, even then it took four generations for some of the cats to show no symptoms of allergies” (Carolyn Biggerstaff, Pottenger’s Cats – an early window on epigenetics).
So, what are we waiting for?
* * *
To give you some idea of how long our society has experienced declining health, check out some of my earlier posts:
“Moore and her colleagues investigated whether C. elegans can convey this learned avoidance behavior to their progeny. They found that when mother worms learned to avoid pathogenic P. aeruginosa, their progeny also knew to avoid the bacteria. The natural attraction of offspring to Pseudomonas was overridden even though they had never previously encountered the pathogen. Remarkably, this inherited aversive behavior lasted for four generations, but in the fifth generation the worms were once again attracted to Pseudomonas.”
This is not an entirely new understanding. Earlier research has found similar results in other species. The study that always fascinates me had to do with rodents. The scent of cherry blossoms was emitted in their cage, and immediately afterward the bottom of the cage was electrified. Unsurprisingly, the rodents jumped around trying to avoid the pain. They learned to begin jumping at the mere presence of the scent, whether or not any electric shock followed. The interesting part is that their descendants, though never shocked themselves, would also jump when they smelled cherry blossoms. And this lasted for multiple generations. A very specific learned behavior was passed on.
Of course, this isn’t limited to worms and rodents. Humans are harder to study, partly because of our longer lives. But researchers have been able to observe multiple living generations to discover patterns. I’m not sure if this exactly fits into learned behavior, except in how the body learns to respond to the environment. It’s similar enough. This other research found that the children and grandchildren of famine survivors had higher rates of obesity that could not be explained by genetics or diet. This is what is called epigenetics: how the genes get set for expression. The same genes can be switched on or off in numerous ways in relation to other genes.
I find that fascinating. It also complicates things considerably. Almost no research ever controls for multigenerational confounding factors. Epigenetics has been largely a black box until quite recently. To be certain that a particular behavior was directly related to specific genetics in a population, you would have to follow that population for many generations. To fully control for confounders would require a study lasting more than a century. It might turn out that much of what we call ‘culture’ is more correctly explained as population-wide epigenetics.
* * *
As a side note, this would have immense significance to dietary and nutritional research. Many of the dietary changes that have happened in modern society are well within the range of epigenetic involvement. And the epigenetic effects likely would be cumulative.
We are in the midst of an ongoing and uncontrolled experiment. No one knows the long-term consequences of the modern industrial diet of refined carbohydrates, added sugars, highly processed vegetable oils, food additives, farm chemicals, microplastics, etc. It’s a mass experiment, and the subjects never chose to participate.
Definitely, we have reasons to be concerned. Francis M. Pottenger Jr. studied the dietary impact on feline health. He fed some cats a raw food diet, others a cooked food diet, and a third group a mix of raw and cooked. The cats on the cooked food diet became sickly in the first generation and were entirely infertile after a few generations.
This is not exactly similar to the human diet of industrial foods. But it points to how results play out across generations. The worst effects aren’t necessarily seen in the immediate generation(s). It’s future generations that have to deal with what those before them caused, as true for epigenetics as it is for national debt and environmental destruction.
Epigenetics is what determines which genes express and how they express. Research on epigenetics has, for some reason, often focused on negative consequences.
In rodent research, scientists were able to induce a Pavlovian response to a smell that preceded a shock. The rodents would jump when the smell was present, even when no shock followed. And generations of rodents kept jumping, despite their never having been shocked at all. The Pavlovian response was inherited. In human research, scientists studied populations that had experienced famine. They looked at multiple generations where only the older generation had been alive during the famine. Yet all the generations following had higher rates of obesity. They inherited the biological preparation for famine.
One might start to think that epigenetics is a bad thing, almost like a disease. But that would be a mistake. Everything about who we are, good and bad, is shaped by epigenetics. To balance things out, I just came across a more positive example. Health benefits get passed on as well. I would note, however, that this is also what exacerbates inequality. This is why oppression and privilege get inherited not only through social conditions but in biology itself. This is all the more reason we should intervene to create the most optimal conditions for everyone, not merely the fortunate few.
This is why the political left emphasizes equality of results, beyond theoretical equality of opportunity. Opportunity is meaningless if it remains an abstract ideal disconnected from lived reality for most of the population. Telling people to get over the past is cruel and ignorant. The past is never past and, in fact, becomes imprinted upon the bodies of many generations, maybe across centuries. Historical injustices and transgenerational trauma are what our society is built upon, and much of it is within living memory, from the Indian Wars to Jim Crow.
It will require direct action to undo the damage and to promote the public good. That is the only path toward a free and fair society.
Physical exercise is well known for its positive effects on general health (specifically, on brain function and health), and some mediating mechanisms are also known. A few reports have addressed intergenerational inheritance of some of these positive effects from exercised mothers or fathers to the progeny, but with scarce results in cognition. We report here the inheritance of moderate exercise-induced paternal traits in offspring’s cognition, neurogenesis, and enhanced mitochondrial activity. These changes were accompanied by specific gene expression changes, including gene sets regulated by microRNAs, as potential mediating mechanisms. We have also demonstrated a direct transmission of the exercise-induced effects through the fathers’ sperm, thus showing that paternal physical activity is a direct factor driving offspring’s brain physiology and cognitive behavior.
Physical exercise has positive effects on cognition, but very little is known about the inheritance of these effects to sedentary offspring and the mechanisms involved. Here, we use a patrilineal design in mice to test the transmission of effects from the same father (before or after training) and from different fathers to compare sedentary- and runner-father progenies. Behavioral, stereological, and whole-genome sequence analyses reveal that paternal cognition improvement is inherited by the offspring, along with increased adult neurogenesis, greater mitochondrial citrate synthase activity, and modulation of the adult hippocampal gene expression profile. These results demonstrate the inheritance of exercise-induced cognition enhancement through the germline, pointing to paternal physical activity as a direct factor driving offspring’s brain physiology and cognitive behavior.
In this jungle of invading viruses, undead pseudogenes, shuffled exons and epigenetic marks, can the classical concept of the gene survive? It is an open question, one that Dr. Prohaska hopes to address at a meeting she is organizing at the Santa Fe Institute in New Mexico next March.
In the current issue of American Scientist, Dr. Gerstein and his former graduate student Michael Seringhaus argue that in order to define a gene, scientists must start with the RNA transcript and trace it back to the DNA. Whatever exons are used to make that transcript would constitute a gene. Dr. Prohaska argues that a gene should be the smallest unit underlying inherited traits. It may include not just a collection of exons, but the epigenetic marks on them that are inherited as well.
These new concepts are moving the gene away from a physical snippet of DNA and back to a more abstract definition. “It’s almost a recapture of what the term was originally meant to convey,” Dr. Gingeras said.
A hundred years after it was born, the gene is coming home.
This complex interweaving of genes, transcripts, and regulation makes the net effect of a single mutation on an organism much more difficult to predict, Gingeras says.
More fundamentally, it muddies scientists’ conception of just what constitutes a gene. In the established definition, a gene is a discrete region of DNA that produces a single, identifiable protein in a cell. But the functioning of a protein often depends on a host of RNAs that control its activity. If a stretch of DNA known to be a protein-coding gene also produces regulatory RNAs essential for several other genes, is it somehow a part of all those other genes as well?
To make things even messier, the genetic code for a protein can be scattered far and wide around the genome. The ENCODE project revealed that about 90 percent of protein-coding genes possessed previously unknown coding fragments that were located far from the main gene, sometimes on other chromosomes. Many scientists now argue that this overlapping and dispersal of genes, along with the swelling ranks of functional RNAs, renders the standard gene concept of the central dogma obsolete.
Long Live The Gene
Offering a radical new conception of the genome, Gingeras proposes shifting the focus away from protein-coding genes. Instead, he suggests that the fundamental units of the genome could be defined as functional RNA transcripts.
Since some of these transcripts ferry code for proteins as dutiful mRNAs, this new perspective would encompass traditional genes. But it would also accommodate new classes of functional RNAs as they’re discovered, while avoiding the confusion caused by several overlapping genes laying claim to a single stretch of DNA. The emerging picture of the genome “definitely shifts the emphasis from genes to transcripts,” agrees Mark B. Gerstein, a bioinformaticist at Yale University.
Scientists’ definition of a gene has evolved several times since Gregor Mendel first deduced the idea in the 1860s from his work with pea plants. Now, about 50 years after its last major revision, the gene concept is once again being called into question.
Over the years, however, what scientists might consider “a lot” in this context has quietly inflated. Last June, Pritchard and his Stanford colleagues Evan Boyle and Yang Li (now at the University of Chicago) published a paper about this in Cell that immediately sparked controversy, although it also had many people nodding in cautious agreement. The authors described what they called the “omnigenic” model of complex traits. Drawing on GWAS analyses of three diseases, they concluded that in the cell types that are relevant to a disease, it appears that not 15, not 100, but essentially all genes contribute to the condition. The authors suggested that for some traits, “multiple” loci could mean more than 100,000. […]
For most complex conditions and diseases, however, she thinks that the idea of a tiny coterie of identifiable core genes is a red herring because the effects might truly stem from disturbances at innumerable loci — and from the environment — working in concert. In a new paper out in Cell this week, Wray and her colleagues argue that the core gene idea amounts to an unwarranted assumption, and that researchers should simply let the experimental data about particular traits or conditions lead their thinking. (In their paper proposing omnigenics, Pritchard and his co-authors also asked whether the distinction between core and peripheral genes was useful and acknowledged that some diseases might not have them.)
Epigenetics is fascinating, even bizarre by conventional thought. Some worry that it’s another variety of determinism, just not located in the genes. I have other worries, if not that particular one.
How epigenetics works is that a gene gets switched on or off. The key point is that it’s not permanently set. Some later incident, condition, behavior, or whatever can switch it back the other way again. Genes in your body are switched on and off throughout your lifetime. But presumably, if no significant changes occur, some epigenetic settings remain in place for one’s entire life.
Where it gets fascinating is that it’s been proven that epigenetics gets passed on across multiple generations and no one is certain how many generations. In mice, it can extend at least upwards of 7 generations or so, as I recall. Humans, of course, haven’t been studied for that many generations. But present evidence indicates it operates similarly in humans.
Potentially, all of the major tragedies in modern history (violence of colonialism all around the world, major famines in places like Ireland and China, genocides in places like the United States and Rwanda, international conflicts like the world wars, etc.), all of that is within the range of epigenetics. It’s been shown that famine, for example, switches genes for a few generations in ways that cause increased fat retention, which in the modern world means higher obesity rates.
I'm not sure what the precise mechanism is that causes genes to switch on and off (e.g., precisely how starvation gets imprinted on biology and stays set that way for multiple generations). All I know is that it has to do with the proteins that encase the DNA. The main interest is that, once we do understand the mechanism, we will be able to control the process. This might be a way of preventing or managing numerous physical and psychiatric health conditions. So, it really would mean the opposite of determinism.
This research reminds me of other scientific and anecdotal evidence. Consider the recipients of organ transplants, blood and bone marrow transfusions, and microbiome transference. This involves the exchange of cells from one body to another. The results have shown changes in mood, behavior, biological functioning, and so on.
For example, introducing a new microbiome can make a skinny rodent fat or a fat rodent skinny. Also observed are shifts in fairly specific memories, such as an organ transplant recipient craving something the organ donor craved. Furthermore, research has shown that genetic material can jump from the introduced cells to the cells already present. This is how a baby can potentially end up with the cells of two fathers, if a previous pregnancy was by a different father; in fact, it's rather common for people to carry multiple sets of DNA in their bodies.
It intuitively makes sense that epigenetics would be behind memory. It's easy to argue that no other function in the body has that kind and degree of capacity. And that possibility would blow up our ideas of the human mind. In that case, some element of memory would get passed on across multiple generations, explaining certain similarities seen in families and in larger populations with shared epigenetic backgrounds.
This gives new meaning to the theories of both the embodied mind and the extended mind. There might also be some interesting implications for the bundle theory of mind. I wonder too about something like enactivism, which concerns the human mind's relation to the world. Of course, there are obvious connections between this specific research and neurological plasticity, and between epigenetics more generally and intergenerational trauma.
So, it wouldn't only be the symptoms of trauma or the benefits of privilege (or whatever other conditions shape individuals, generational cohorts, and sub-populations) being inherited, but some of the memory itself. This puts bodily memory in a much larger context, maybe even something along the lines of Jungian thought, in terms of collective memory and archetypes (depending on how long-lasting some epigenetic effects might be). Also, much of what people think of as cultural, ethnic, and racial differences might simply be epigenetics. This would puncture an even larger hole in genetic determinism and race realism. Unlike genetics, epigenetics can be changed.
Our understanding of so much is going to be completely altered. What once seemed crazy or unthinkable will become the new dominant paradigm. This is both promising and scary. Imagine what authoritarian governments could do with this scientific knowledge. The Nazis could only dream of creating a superman. But between genetic engineering and epigenetic manipulations, the possibilities are wide open. And right now, we have no clue what we are doing. The early experimentation, specifically research done covertly, is going to be of the mad scientist variety.
These interesting times are going to get way more interesting.
The finding is surprising because it suggests that a nerve cell body “knows” how many synapses it is supposed to form, meaning it is encoding a crucial part of memory. The researchers also ran a similar experiment on live sea slugs, in which they found that a long-term memory could be totally erased (as gauged by its synapses being destroyed) and then re-formed with only a small reminder stimulus—again suggesting that some information was being stored in a neuron’s body.
Synapses may be like a concert pianist’s fingers, explains principal investigator David Glanzman, a neurologist at U.C.L.A. Even if Chopin did not have his fingers, he would still know how to play his sonatas. “This is a radical idea, and I don’t deny it: memory really isn’t stored in synapses,” Glanzman says.
Other memory experts are intrigued by the findings but cautious about interpreting the results. Even if neurons retain information about how many synapses to form, it is unclear how the cells could know where to put the synapses or how strong they should be—which are crucial components of memory storage. Yet the work indeed suggests that synapses might not be set in stone as they encode memory: they may wither and re-form as a memory waxes and wanes. “The results are really just kind of surprising,” says Todd Sacktor, a neurologist at SUNY Downstate Medical Center. “It has always been this assumption that it’s the same synapses that are storing the memory,” he says. “And the essence of what [Glanzman] is saying is that it’s far more dynamic.”
Glanzman’s experiments—funded by the National Institutes of Health and the National Science Foundation—involved giving mild electrical shocks to the marine snail Aplysia californica. Shocked snails learn to withdraw their delicate siphons and gills for nearly a minute as a defense when they subsequently receive a weak touch; snails that have not been shocked withdraw only briefly.
The researchers extracted RNA from the nervous systems of snails that had been shocked and injected the material into unshocked snails. RNA’s primary role is to serve as a messenger inside cells, carrying protein-making instructions from its cousin DNA. But when this RNA was injected, these naive snails withdrew their siphons for extended periods of time after a soft touch. Control snails that received injections of RNA from snails that had not received shocks did not withdraw their siphons for as long.
“It’s as if we transferred a memory,” Glanzman said.
Glanzman’s group went further, showing that Aplysia sensory neurons in Petri dishes were more excitable, as they tend to be after being shocked, if they were exposed to RNA from shocked snails. Exposure to RNA from snails that had never been shocked did not cause the cells to become more excitable.
The results, said Glanzman, suggest that memories may be stored within the nucleus of neurons, where RNA is synthesized and can act on DNA to turn genes on and off. He said he thought memory storage involved these epigenetic changes—changes in the activity of genes and not in the DNA sequences that make up those genes—that are mediated by RNA.
This view challenges the widely held notion that memories are stored by enhancing synaptic connections between neurons. Rather, Glanzman sees synaptic changes that occur during memory formation as flowing from the information that the RNA is carrying.
The original meaning of a gene was simply a heritable unit. This was long before the discovery of DNA. The theory was based on phenotype, i.e., observable characteristics. What they didn’t know and what still doesn’t often get acknowledged is that much gets inherited from parents, especially from the mother. This includes everything from epigenetics to microbiome, the former determining which genes express and how they express while the latter consists of the majority of genetics in the human body. The fetus will also inherit health conditions from the mother, such as malnutrition and stress, viruses and parasites — all of those surely having epigenetic effects and microbiome changes that could get passed on for generations.
Even more interestingly, DNA itself gets passed on in diverse ways. Viruses will snip out sections of DNA and then put them into the DNA of new hosts. Mothers, including surrogate mothers, can gain DNA from the fetuses they carry. And those mothers can then pass that DNA to any fetus they carry afterward, which could cause a fetus to have DNA from two fathers. A fetus can also absorb DNA from a fraternal twin, or even absorb the other fetus entirely, forming what is called a chimera. Bone marrow transplantees also become chimeras because they inherit the stem cells for blood cells from the donor, along with inheriting epigenetics from the donor. These chimeras could pass this on during a transplantee's pregnancy.
We hardly know what all that might mean. There is no single heritable unit that by itself does anything. That is not the direct source of causation. A gene only acts as part of DNA within a specific cell, and all of that within an entire biological system existing within specific environmental conditions. The most important causal factors are various. What is in DNA only matters to the degree it is expressed, but what determines its expression will also determine how it expresses. Evelyn Fox Keller writes that, "the causal interactions between DNA, proteins, and trait development are so entangled, so dynamic, and so dependent on context that the very question of what genes do no longer makes much sense. Indeed, biologists are no longer confident that it is possible to provide an unambiguous answer to the question of what a gene is. The particulate gene is a concept that has become increasingly ambiguous and unstable, and some scientists have begun to argue that the concept has outlived its productive prime" (The Mirage of a Space between Nature and Nurture, p. 50). Gene expression as seen in phenotype is determined by a complex system of overlapping factors. Talk of genes doesn't help us much, if at all. And heritability rates tell us absolutely nothing about the details, such as distinguishing what exactly a gene is as a heritable unit and causal factor, much less differentiating that from everything else. As Keller further explains:
“It is true that many authors continue to refer to genes, but I suspect that this is largely due to the lack of a better terminology. In any case, continuing reference to “genes” does not obscure the fact that the early notion of clearly identifiable, particulate units of inheritance— which not only can be associated with particular traits, but also serve as agents whose actions produce those traits— has become hopelessly confounded by what we have learned about the intricacies of genetic processes. Furthermore, recent experimental focus has shifted away from the structural composition of DNA to the variety of sequences on DNA that can be made available for (or blocked from) transcription— in other words, the focus is now on gene expression. Finally, and relatedly, it has become evident that nucleotide sequences are used not only to provide transcripts for protein synthesis, but also for multilevel systems of regulation at the level of transcription, translation, and posttranslational dynamics. None of this need impede our ability to correlate differences in sequence with phenotypic differences, but it does give us a picture of such an immensely complex causal dynamic between DNA, RNA, and protein molecules as to definitely put to rest all hopes of a simple parsing of causal factors. Because of this, today’s biologists are far less likely than their predecessors were to attribute causal agency either to genes or to DNA itself— recognizing that, however crucial the role of DNA in development and evolution, by itself, DNA doesn’t do anything. It does not make a trait; it does not even encode a program for development. Rather, it is more accurate to think of DNA as a standing resource on which a cell can draw for survival and reproduction, a resource it can deploy in many different ways, a resource so rich as to enable the cell to respond to its changing environment with immense subtlety and variety. 
As a resource, DNA is indispensable; it can even be said to be a primary resource. But a cell’s DNA is always and necessarily embedded in an immensely complex and entangled system of interacting resources that are, collectively, what give rise to the development of traits. Not surprisingly, the causal dynamics of the process by which development unfolds are also complex and entangled, involving causal influences that extend upward, downward, and sideways.” (pp. 50-52)
Even something seemingly as simple as gender is far from simple. Claire Ainsworth has a fascinating piece, Sex redefined (nature.com), where she describes the new understanding that has developed. She writes that, “Sex can be much more complicated than it at first seems. According to the simple scenario, the presence or absence of a Y chromosome is what counts: with it, you are male, and without it, you are female. But doctors have long known that some people straddle the boundary — their sex chromosomes say one thing, but their gonads (ovaries or testes) or sexual anatomy say another. Parents of children with these kinds of conditions — known as intersex conditions, or differences or disorders of sex development (DSDs) — often face difficult decisions about whether to bring up their child as a boy or a girl.”
This isn't all that rare considering that, "Some researchers now say that as many as 1 person in 100 has some form of DSD." And, "What's more, new technologies in DNA sequencing and cell biology are revealing that almost everyone is, to varying degrees, a patchwork of genetically distinct cells, some with a sex that might not match that of the rest of their body. Some studies even suggest that the sex of each cell drives its behaviour, through a complicated network of molecular interactions." Gender should be one of the most obvious areas to prove genetic determinism, if it could be proven. But clearly there is more going on here. The inheritance and expression of traits is a messy process. And we are barely scratching the surface. I haven't seen any research that explores how epigenetics, microbiome, etc. could influence gender or similar developmental results.
“If memory is not located in the synapse, then where is it? When the neuroscientists took a closer look at the brain cells, they found that even when the synapse was erased, molecular and chemical changes persisted after the initial firing within the cell itself. The engram, or memory trace, could be preserved by these permanent changes. Alternatively, it could be encoded in modifications to the cell’s DNA that alter how particular genes are expressed. Glanzman and others favor this reasoning.”
“Now a fascinating new study reveals that it’s not just nurture. Traumatic experiences can actually work themselves into the germ line. When a male mouse becomes afraid of a specific smell, this fear is somehow transmitted into his sperm, the study found. His pups will also be afraid of the odor, and will pass that fear down to their pups.”