Medical-Industrial Complex

“Unless we put medical freedom into the Constitution, the time will come when medicine will organize into an undercover dictatorship…To restrict the art of healing to one class of men and deny equal privileges to others will constitute the Bastille of medical science. All such laws are un-American and despotic…, and have no place in a republic…The Constitution of this Republic should make special provisions for medical freedom as well as religious freedom.”

Dr. Benjamin Rush, signer of the Declaration of Independence and member of the Continental Congress

“The efforts of the medical profession in the US to control:…its…job it proposes to monopolize. It has been carrying on a vigorous campaign all over the country against new methods and schools of healing because it wants the business…I have watched this medical profession for a long time and it bears watching.”

Clarence Darrow (1857-1938), Populist leader and lawyer

“Medicine is a social science and politics is medicine on a large scale…The very words ‘Public Health’ show those who are of the opinion that medicine has nothing to do with politics the magnitude of their error.”

Rudolf Virchow (1821-1902), founder of cellular pathology

“The profession to which we belong, once venerated…has become corrupt and degenerate to the forfeiture of its social position…”

Dr. Nathaniel Chapman, first president, AMA, 1848

In 1922, Herbert McLean Evans and Katharine Scott Bishop discovered vitamin E. Then, over the following decades from the 1930s through the 1940s, Drs. Wilfrid and Evan Shute treated 30,000 patients with natural vitamin E in their clinic and studied its health benefits. Despite all of the documented evidence, they had little influence in mainstream nutrition and medicine. They had the disadvantage of promoting a vitamin right at the beginning of the era when pharmaceuticals were getting all of the attention: “Better Living Through Chemistry.” Responding to the resistance of medical authorities, in his book The Heart and Vitamin E (1956), Dr. Evan Shute wrote:

“It was nearly impossible now for anyone who valued his future in Academe to espouse Vitamin E, prescribe it or advise its use. That would make a man a “quack” at once. This situation lasted for many years. In the United States, of course, the closure of the JAMA pages against us and tocopherol meant that it did not exist. It was either in the U.S. medical bible or it was nought. No amount of documentation could budge medical men from this stance. Literature in the positive was ignored and left unread. Individual doctors often said: ‘If it is as good as you say, we would all be using it.’ But nothing could induce them as persons of scientific background to make the simplest trial on a burn or coronary.”

In the article “Drs. Wilfrid and Evan Shute Cured Thousands with Vitamin E,” Andrew W. Saul emphasized this suppression of new knowledge:

“The American Medical Association even refused to let the Shutes present their findings at national medical conventions. (p 148-9) In the early 1960’s, the United States Post Office successfully prevented even the mailing of vitamin E. (p 166).” Over the decades, others have taken note of the heavy-handedness of mainstream authorities. “The failure of the medical establishment during the last forty years,” wrote Linus Pauling in his 1985 Foreword, “to recognize the value of Vitamin E in controlling heart disease is responsible for a tremendous amount of unnecessary suffering and for many early deaths. The interesting story of the efforts to suppress the Shute discoveries about Vitamin E illustrates the shocking bias of organized medicine against nutritional measures for achieving improved health.”

What is motivating this ‘failure’? And is it really a failure or simply serving other interests, maybe quite successfully at that?

* * *

“Today, expulsion is again mustered into service in a war of ideology. …Modern society makes its heresies out of political economy…Ethics has always been a flexible, developing notion of medicine, with a strong flavor of economics from the start.”

Oliver Garceau, Dept. of Government, Harvard U., The Political Life of the AMA (1941)

“Everyone’s heard about the military-industrial complex, but they know very little about the medical-industrial complex…(in) a medical arms race…”

California Governor Jerry Brown, June 1980

“The new medical-industrial complex is now a fact of American life…with broad and potentially troubling implications…”

Dr. Arnold Relman, Editor, New England Journal of Medicine

“Bankers regard research as most dangerous and a thing that makes banking hazardous due to the rapid changes it brings about in industry.”

Charles Kettering, of Memorial Sloan Kettering Cancer Center and Vice President of General Motors (in Ralph Moss, The Cancer Syndrome)

“The system of influence and control…is highly skewed in favor of the corporate and financial system. And this dominant influence is felt not only in universities, foundations, and institutions of higher learning, but also…from media to all other instruments of communication.”

Vicente Navarro (Professor of Health and Social Policy, Johns Hopkins U., and other credentials)

“In the feeding of hospital patients, more attention should be given to providing tasty and attractive meals, and less to the nutritive quality of the food.”
“People say that all you get out of sugar is calories, no nutrients…There is no perfect food, not even mother’s milk.”
“Have confidence in America’s food industry, it deserves it.”

Dr. Frederick Stare, Harvard U. School of Public Health, Nutrition Dept. Head

So, why are the powers that be so concerned with harmless supplements that consumers take in seeking self-healing and well-being? The FDA explained its motivations:

“It has been common…to combine such unproven ingredients as bioflavonoids, rutin…, with such essential nutrients as Vitamin C…, thus implying that they are all nutritionally valuable for supplementation of the daily diet. The courts have sustained FDA legal action to prevent such practices, and the new FDA regulations preclude this type of combination in the future…Similarly, it has been common…to state or imply that the American diet is inadequate because of soil deficiencies, commercial processing methods, use of synthetic nutrients, and similar charges. FDA recognizes that these false statements have misled, scared, and confused the public, and is prohibiting any such general statements in the future…The medical and nutritional professions have shown strong support of this policy,…” (FDA Assistant General Counsel’s letter to 5 US Legislators, Hearings, US Congress, 1973).

To give a further example of this contorted thinking, consider another statement from an FDA official: “It is wholly unscientific to state that a well-fed body is more able to resist disease than a less well-fed body” (FDA’s Head of Nutrition Department, Dr. Elmer M. Nelson, in Gene Marine and Judith Van Allen, Food Pollution: The Violation of Our Inner Ecology). That is so absurd as to be unbelievable. Yet it’s sadly expected when one knows of incidents like Ancel Keys’ attack on John Yudkin, amid the wholesale silencing of Keys’ detractors, and the more recent high-level persecution of Tim Noakes, along with dozens of other examples.

The advocates of natural healing and sellers of nutritional supplements were criticizing the dominant system of big ag, big drug, and closely related industries. This was a challenge to power and profit, and so it could not be tolerated. One wouldn’t want the public to get confused… nor new generations of doctors, as the Harvard Medical School dean, Dr. David Edsall, explained: “…students were obliged…to learn about an interminable number of drugs, many…valueless, …useless, some…harmful. …there is less intellectual freedom in the medical course than in almost any other form of professional education in this country.”

This is how we end up with young doctors, straight out of medical school, failing a basic test on nutrition (Most Mainstream Doctors Would Fail Nutrition). Who funds much of the development of medical school curricula? Private corporations, specifically big drug and big food, and the organizations that represent them. Once out of medical school, some doctors end up making millions of dollars by working for industry on the side, such as giving speeches to promote pharmaceuticals. Also, continuing education and scientific conferences are typically funded by this same big money from the private sphere. There is a lot of money sloshing around, not to mention the petty bribes of free vacations and the like given to doctors. It’s a perverse incentive structure, one carefully designed to manipulate and bias the entire healthcare system.

* * *

“[Doctors] collectively have done more to block adequate medical care for people of this country than any other single group.”

President Jimmy Carter

“I think doctors care very deeply about their patients, but when they organize into the AMA, their responsibility is to the welfare of doctors, and quite often, these lobbying groups are the only ones that are heard in the state capitols and in the capitol of our country.”

President Jimmy Carter

“The FDA and much, but not all, of the orthodox medical profession are actively hostile against vitamins and minerals… They are out to get the health food industry…And they are trying to do this out of active hostility and prejudice.”

Senator William Proxmire (in National Health Federation Bulletin, April 1974)

“Eminent nutritionists have traded their independence for the food industry’s favors.”

US Congressman Benjamin Rosenthal

“The problem with ‘prevention’ is that it does not produce revenues. No health plan reimburses a physician or a hospital for preventing a disease.”

NCI Deputy Director, Division of Cancer Cause and Prevention; and of Diet, Nutrition and Cancer Program

“What is the explanation for the blind eye that has been turned on the flood of medical reports on the causative role of carbohydrates in overweight, ever since the publication in 1864 of William Banting’s famous “Letter on Corpulence”? Could it be related, in part, to the vast financial endowments poured into the various departments of nutritional education by the manufacturers of our refined carbohydrate foodstuff?”

Robert C. Atkins, MD, Dr. Atkins Diet Revolution, c. 1972

“Although the stated purpose of licensure is to benefit the public…Consumers…have learned that licensing may add to the cost of services, while not assuring quality….Charges…the legal sector that licensure restricts competition, and therefore unnecessarily increases costs to consumers….Like other professionals, dietitians can justify the enactment of licensure laws because licensing affords the opportunity to protect dietitians from interference in their field by other practitioners…This protection provides a competitive advantage, and therefore is economically beneficial for dietitians.”

ADA President, Marilyn Haschske, JADA, 1984

“While millions of dollars were being projected for research on radiation and other cancer ‘cures’, there was an almost complete blackout on research that might have pointed to needed alterations in our environment, our industrial organization, and our food.”

Carol Lopate, in Health Policy Advisory Center, Health PAC Bulletin

“Research in the US has been seriously affected by restrictions imposed by foreign cartel members. …It has attempted to suppress the publication of scientific research data which were at variance with its monopoly interest. …The hostility of cartel members toward a new product which endangers their control of the market(:)…In the field of synthetic hormones, the cartel control has been …detrimental to our national interest.”

US Assistant Attorney General Wendell Berge, Cartels: Challenge to the Free World (in Eleanor McBean, The Poisoned Needle)

“We are aware of many cases in industry, government laboratories, and even universities where scientists have been retaliated against when their professional standards interfered with the interests of their employers or funders. This retaliation has taken many forms, ranging from loss of employment and industry-wide blacklisting to transfers and withholding of salary increases and promotions. We are convinced that the visible problem is only the tip of the iceberg.”

American Chemical Society President Alan C. Nixon (in Science, 1973)

Similar to the struggles of the Shute brothers, this problem was faced by the early scientists studying the ketogenic diet and the early doctors using it to treat patients with epilepsy. The first research and application of the ketogenic diet began in the 1920s, and it was quickly found useful for other health conditions. But after a brief period of interest and funding, the research was mostly shut down in favor of the emerging new drugs that could be patented and marketed. It was irrelevant that the keto diet was far more effective than any drugs produced then or since. The ketogenic diet lingered on in a few hospitals and clinics, until research was revived in the 1990s, about three-quarters of a century later. Yet, after hundreds of studies proving its efficacy for numerous diseases (obesity, diabetes, multiple sclerosis, Alzheimer’s, etc.), mainstream authority figures and the mainstream media continue to dismiss it and spread fear-mongering, such as false and ignorant claims about ketoacidosis and kidney damage.

Also, consider the medical use of X-rays. In 1896, within a year of Röntgen’s discovery, Dr. Émil Herman Grubbé became the first to use X-rays for cancer treatment. Did the medical profession embrace this great discovery? Of course not. It wasn’t acknowledged as useful until 1951. When asked what he thought about this backward mentality denying such a profound discovery, Dr. Grubbé didn’t mince words: “The surgeons. They controlled medicine, and they regarded the X-ray as a threat to surgery. At that time surgery was the only approved method of treating cancer. They meant to keep it the ‘only’ approved method by ignoring or rejecting any new methods or ideas. This is why I was called a ‘quack’ and nearly ejected from hospitals where I had practiced for years” (Herbert Bailey, Vitamin E: Your Key to a Healthy Heart). As with the Shute brothers, he was deemed a ‘quack’ and so case closed.

There have been many more examples over the past century, in particular during the oppressive Cold War era (Cold War Silencing of Science). The dominant paradigm during McCarthyism was far from limited to scapegoating commies and homosexuals. Anyone stepping out of line could find themselves targeted by the powerful. This reactionary impulse goes back many centuries and continues to exert its influence to this day, continuing to punish those who dare speak out (Eliminating Dietary Dissent). This hindering of innovation and progress may be holding civilization back by centuries. We seem unable to deal with the simplest of problems, even when we already have the knowledge of how to solve them.

* * *

“Relevant research on the system as a whole has not been done… It is remarkable that with the continuing health care ‘crisis’, so few studies of the consequences of alternative modes of delivering care have been done. Such a paucity of studies is no accident; such studies would challenge structural interests of both professional monopoly (MD’s) and corporate rationalization in maintaining health institutions as they now exist or in directing their ‘orderly’ expansion.”

Robert R. Alford, Professor, UC Santa Cruz, Health Care Politics

“…It seems that public officials are afraid that if they make any move, or say anything antagonistic to the wishes of the medical organization, they will be pounced upon and destroyed… Public officials seem to be afraid of their jobs and even of their lives.”

US Senator Elmer Thomas, in Morris A. Bealle, The Drug Story, c. 1949 and 1976

“I think every doctor should know the shocking state of affairs…We discovered they (the FDA) failed to effectively regulate the large manufacturers and powerful interests while recklessly persecuting the small manufacturers. …(The FDA is) harassing (small) manufacturers and doctors…(and) betrays the public trust.”

Senator Edward V. Long, 1967

“The AMA protects the image of the food processors by its constant propaganda that the American food supply is the finest in the world, and that (those) who question this are simply practicing quackery. The food processors, in turn, protect the image of the AMA and of the drug manufacturers by arranging for the USDA and its dietetic cronies to blacklist throughout the country and in every public library, all nutrition books written for the layman, which preach simple, wholesome nutrition and attack …both the emasculation of natural foods and orthodox American medical care, which ignores subtle malnutrition and stresses drug therapy, (“as distinct from vitamin therapy”) for innumerable conditions. The drug manufacturers vigorously support the AMA since only MD’s can prescribe their products.”

Miles H. Robinson, MD; Professor, University of Pennsylvania and Vanderbilt Medical Schools, exhibit in Vitamin, Mineral, and Diet Supplements, Hearings, US House of Representatives, 1973

“The AMA puts the lives and well being of the American citizens well below its own special interest…It deserves to be ignored, rejected, and forgotten. No amount of historical gymnastics can hide the public record of AMA opposition to virtually every major health reform in the past 50 years….The AMA has turned into a propaganda organ purveying ‘medical politics’ for deceiving the Congress, the people, and the doctors of America themselves.”

Senator Edward Kennedy, in UPI National Chronicle, 1971

“The hearings have revealed police-state tactics…possibly perjured testimony to gain a conviction,…intimidation and gross disregard for the Constitutional Rights…(of) First, Fourth, Fifth, and Sixth Amendments, (by the FDA)”
“The FDA (is) bent on using snooping gear to pry and invade…”
“Instance after instance of FDA raids on small vitamin and food supplement manufacturers. These small, defenseless businesses were guilty of producing products which FDA officials claimed were unnecessary.”
“If the FDA would spend a little less time and effort on small manufacturers of vitamins…and a little more on the large manufacturers of…dangerous drugs…, the public would be better served.”

Senator Long from various Senate hearings

“From about 1850 until the late 1930’s, one of the standing jokes in the medical profession, was about a few idiots who called themselves doctors, who claimed they could cure pneumonia by feeding their patients moldy bread. …Until…they discovered penicillin…in moldy bread!”

P.E. Binzel, MD, in Thomas Mansell, Cancer Simplified, 1977

“Penicillin sat on a shelf for ten years while I was called a quack.”

Sir Alexander Fleming.

“(In) 1914… Dr. Joseph Goldberger had proven that (pellagra) was related to diet, and later showed that it could be prevented by simply eating liver or yeast. But it wasn’t until the 1940’s…that the ‘modern’ medical world fully accepted pellagra as a vitamin B deficiency.”

G. Edward Griffin, World Without Cancer

“…The Chinese in the 9th century AD utilized a book entitled The Thousand Golden Prescriptions, which described how rice polish could be used to cure beri-beri, as well as other nutritional approaches to the prevention and treatment of disease. It was not until twelve centuries later that the cure for beri-beri was discovered in the West, and it was acknowledged to be a vitamin B-1 deficiency disease.”

Jeffrey Bland, PhD, Your Health Under Siege: Using Nutrition to Fight Back

“The intolerance and fanaticism of official science toward Eijkman’s observations (that refined rice caused beri-beri) brought about the death of some half million people on the American continent in our own century alone between 1900 and 1910.”

Josué de Castro, The Geography of Hunger

“In 1540…Ambroise Paré…persuaded doctors to stop the horrid practice of pouring boiling oil on wounds and required all doctors to wash thoroughly before delivering babies or performing surgery….(in) 1844…Ignaz Semmelweis in Vienna proved…that clean, well-scrubbed doctors would not infect and kill mothers at childbirth. For his efforts Semmelweis was dismissed from his hospital…(and) despite publication, his work was totally ignored. As a result he became insane and died in an asylum, and his son committed suicide.”
“As a chemist working for the US Government in 1916 on the island of Luzon (Philippines), (R.R.) Williams, over the opposition of orthodox medicine, had managed to eradicate beri-beri…by persuading the population to drink rice bran tea. In 1917, Williams was recalled to the US, and thereafter orthodox medicine discouraged anyone from drinking rice bran tea, so by 1920 there were more beri-beri deaths on Luzon than in 1915. …In 1934, R.R. Williams (now) at Bell Telephone Labs., discovered thiamine (vitamin B-1), and that thiamine in rice bran both prevented and cured beri-beri.”
“Christiaan Eijkman in Holland…shared the Nobel prize for Medicine in 1929 for proving in 1892 that beri-beri was not an infectious disease…”

Wayne Martin, BS, Purdue University; Medical Heroes and Heretics, & “The Beri-beri analogy to myocardial infarction”, Medical Hypotheses

“In the 1850’s, Ignaz P. Semmelweis, a Hungarian doctor, discovered that childbed fever, which then killed about 12 mothers out of every 100, was contagious…and that doctors themselves were spreading the disease by not cleaning their hands. He was ridiculed…Opponents of his idea attacked him fiercely….(and) brought on (his) mental illness….(he) died a broken man.”

Salem Kirban, Health Guide for Survival

“…Galen…was…forced to flee Rome to escape the frenzy of the mob….Vesalius was denounced as an imposter and heretic…William Harvey was disgraced as a physician…Wilhelm Roentgen…was called a quack and then condemned…”
“In…1535, when…Jacques Cartier found his ships…in…the St. Lawrence River, scurvy began…and then a friendly Indian showed them (that) tree bark and needles from the white pine – both rich in…Vitamin C – were stirred into a drink (for) swift recovery. Upon returning to Europe, Cartier reported this incident to the medical authorities. But they were amused by such ‘witch-doctor cures of ignorant savages’ and did nothing to follow it up…”
“It took over 200 years and cost hundreds of thousands of lives before the medical experts began to accept…Finally, in 1747, James Lind…discovered that oranges and lemons produced relief from scurvy…and yet it took 48 more years before his recommendation was put into effect….’Limeys’ would soon become rulers of the ‘Seven Seas’…”
“In 1593, Sir Richard Hawkins noted and later published, in observations on his voyage into the South Seas, references that natives of the area used sour oranges and lemons as a cure for scurvy, and a similar result was noted among his crew. …In 1804, regulations were introduced into the British Navy requiring use of lime juice….(and) into law by the British Board of Trade in 1865….It took two centuries to translate empirical observations into action…”

Maureen Salaman, MSc, Nutrition: The Cancer Answer

Most of the above quotes were found on a webpage put together by Wade Frazier (Medical Dark Ages Quotes). He gathered the quotes from Ralph Hovnanian’s 1990 book, Medical Dark Ages.

Plant-Based Nutritional Deficiencies

The purpose here is to highlight the nutritional deficiencies of plant-based diets, most specifically plant-exclusive diets such as veganism (important nutrients are listed below). Not all of these deficiencies involve essential nutrients, but our knowledge of what is essential is limited. There are deficiencies that will kill you quickly, others slowly, and still others that will simply cause deteriorating health or less than optimal functioning. Also, some of these nutrients or their precursors can be found in plant foods or otherwise produced by the body, but there can be several problems. The plant-based sources may be inadequate or not in the most bioavailable form, antinutrients in the plants may block the absorption of certain nutrients (e.g., phytates block mineral absorption), gut and microbiome problems related to a plant-based diet might interfere with absorption, and most people have a severely limited capacity to turn certain precursors into the needed nutrients.

So, when eating a supposedly healthy diet, many vegans and vegetarians still have major deficiencies, even with nutrients that should be in their diet according to standard food intake calculations — in those cases, the nutrients are there in theory but for some reason not being absorbed or utilized. For example, raw spinach has a lot of calcium, but it is almost entirely unavailable to the body. Adding raw spinach to your smoothie or salad might be a net loss to your health, as the antinutrients will block the nutrients in other foods as well. Another factor is that, on a plant-based diet, nutrients can get out of ratio. Nutrients work together, with some acting as precursors, others as catalysts, and still others like master hormones — such as vitamin K2 determining where calcium is transported to, preferably the bones as opposed to arteries, joints, and the brain; or think about how the body can produce vitamin D3 but only if there is adequate cholesterol. As such, besides deficiencies, sometimes there can be too much of a nutrient, which interferes with another nutrient, as seen with copper in relation to zinc.

That is the advantage of an animal-based diet, which could even include a well-balanced vegetarian diet that emphasized dairy and eggs (Vegetarianism is an Animal-Based Diet), though unfortunately many vegetarians are near-vegan in limiting even those non-meat animal foods. Here is the reason why animal foods are so important. Other animals have nutritional needs similar to ours, and so, when we eat animal foods, we are getting not only the nutrients our bodies need but in the required form and ratio for our own optimal functioning. Without animal foods, one has to study nutrition to understand all of this and then try to artificially re-create it through careful calculations in balancing what one eats and supplements, an almost impossible task that requires a scientific mindset (a rough sense of what that bookkeeping involves is sketched below). Even then, one is likely to get it wrong. Regular testing of nutritional levels would be absolutely necessary to ensure everything is going according to plan.
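
To get a rough sense of what those calculations involve, here is a minimal sketch in Python. The foods, amounts, and absorption fractions are illustrative assumptions for demonstration only, not authoritative reference values; real bioavailability varies with preparation, nutrient status, and the rest of the meal.

```python
# Minimal sketch: bioavailability-adjusted nutrient accounting.
# All absorption fractions below are illustrative assumptions,
# not authoritative reference values.

# (food, nutrient, label_amount_mg, assumed_fraction_absorbed)
INTAKE_LOG = [
    ("raw spinach, 100 g",  "calcium", 99.0, 0.05),  # oxalate-bound, poorly absorbed
    ("whole milk, 250 ml",  "calcium", 300.0, 0.30),
    ("pumpkin seeds, 30 g", "zinc",    2.2,  0.15),  # phytates reduce absorption
    ("beef, 150 g",         "zinc",    6.5,  0.40),
]

def absorbed_totals(log):
    """Sum bioavailability-adjusted intake per nutrient."""
    totals = {}
    for _food, nutrient, amount_mg, fraction in log:
        totals[nutrient] = totals.get(nutrient, 0.0) + amount_mg * fraction
    return totals

for nutrient, mg in absorbed_totals(INTAKE_LOG).items():
    print(f"{nutrient}: ~{mg:.0f} mg effectively absorbed")
```

The point of the sketch is that label amounts and absorbed amounts diverge sharply: the spinach contributes almost no usable calcium despite a respectable label value. A full accounting would also have to track ratios between interacting nutrients (zinc against copper, calcium against vitamin K2), which is why doing this by hand, without animal foods supplying nutrients in roughly the needed forms and proportions, is so error-prone.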

As for supplements and fortification, the nutrients aren’t always in the best form and so wouldn’t be as bioavailable, nor would they likely have all the needed cofactors in just the right amounts. Besides, a diet dependent on supplementation and fortification is not healthy by definition, in that the foods themselves, in their natural form, lack those nutrients. The fact that most vegans in particular, and many vegetarians as well, have to be extremely obsessive about nutrition just to maintain a basic level of health is not high praise for the health-giving benefits of such a plant-based diet — and hence the reason even vegetarians should emphasize the allowed animal foods (there are even vegans who will make exceptions for some animal foods, such as fish). This is probably why most people quit these diets after a short period of time and why most people who quit, including those who quit after years or decades, do so for health reasons. Among those who remain on these diets, their responses on surveys show that most of them cheat on occasion and so are getting some minimal level of animal-based nutrition, and that is a good thing for their health even as it calls into question the validity of health claims about plant-based diets (Being “mostly vegan” is like being “a little pregnant.”).

There has long been a bias against meat, especially red meat. It goes back to the ancient Greek thought of Galen and how it was adapted to Medieval society in being Christianized for purposes of maintaining social hierarchy and social control. This Galenic bias was carried forward in the Christian tradition and then modernized within nutrition studies through the surprisingly powerful influence of the Seventh Day Adventists who continue to fund a lot of nutritional studies to this day. This has had practical consequences. It has long been assumed, based on a theology of a sinful world, that eating animals would make us beastly. It’s similar to the ancient idea that eating the muscles or heart of a fallen warrior would make one strong or courageous. A similar logic was applied to plants, that they have inherent qualities that we can imbibe.

So, it has long been believed that plant foods are somehow healthier for both body and soul, somehow more spiritual, and so would bring humans closer to God or else closer to their divine natural state before the Fall of Man. That has been the moral concern of many Christians, from Medieval Catholics to modern Seventh Day Adventists. And in secularized form, it became internalized by mainstream nutrition studies and dietary guidelines. Part of the purpose of eating plants, according to Christianized Galenism, was that a strong libido was considered bad and it was understood that a plant-based diet suppressed libido, which admittedly doesn’t sound like a sign of health, but their idea of ‘health’ was very different. There was also a worry that, along with firing up the libido, meat would heat up the entire body and lead to a shorter lifespan. Pseudo-scientific explanations have been used to rationalize this theological doctrine, such as concerns about mTOR and IGF-1, although this requires contorting the science and dismissing other evidence.

The problem is this simply became built into mainstream nutritional ideology, to such an extent that few questioned it until recently. This has led most researchers, nutritionists, dieticians, and other health experts to obsess over the nutrients in plants while overlooking the nutrients in animal foods. So, you’ll hear something along the lines of, “meat is not an important source of vitamin E and with the exception of liver, is not a particularly good source of fat-soluble vitamins” (Nutrients in Meat, from the Meat We Eat). Keep in mind that assertion comes from a project of the American Meat Science Association — not likely to be biased against meat. It’s sort of true, depending on how one defines meat. From Galenic thought, the notion of meat is still associated with red meat. It is true that muscle meat, particularly lean muscle meat, from beef, pork, and veal doesn’t have much vitamin E compared to plant foods (M. Leonhardt et al, Vitamin E content of different animal products: influence of animal nutrition). This is why some vegetarians and even vegans see no contradiction or conflict, much less hypocrisy, in eating fish and fowl — culturally, these have for millennia been considered a separate category from meat.

Yet adequate amounts of vitamin E are found in many animal foods, whether or not we label them as ‘meat’: chicken, goose meat, fish, seafood, crayfish, butter, and cheese; and some vitamin E is also found in liver and eggs (Atli Arnarson, 20 Foods That Are High in Vitamin E). We have to be clear what we mean by ‘meat’. On a meat-based diet, even to the degree of being carnivore, there are plentiful good sources of every essential nutrient, including vitamin E, and many that aren’t essential but are highly conducive to optimal health. Besides animal foods, there is no other source of such immense nutrient density and nutrient bioavailability. Plant foods don’t come close in comparison.

Also, as vitamin E is an antioxidant, it’s important to note that animal foods contain many other antioxidants that play a similar role in maintaining health, but animal-sourced antioxidants have been mostly ignored because they don’t fit the dominant plant-based paradigm. Plant foods lack these animal-sourced antioxidants. So why do so few talk about a deficiency in them for vegans and vegetarians? And why have researchers so rarely studied in depth the wide variety of nutrients in animal foods to determine their full health benefits? This is particularly odd considering that, as I already stated, every known essential nutrient can be found in animal foods but not in plant foods. Isn’t that an important detail? Why is there a collective silence among mainstream health experts?

Think about how plant antinutrients can block the absorption of nutrients, both in plant foods and animal foods, and so require even more nutrients to counteract this effect, which might simply further increase antinutrient intake, unless one is careful in following the food selection and preparation advised by those like Steven Gundry (The Plant Paradox). Or think about how glucose competes with the antioxidant vitamin C, raising the risk of scurvy if vitamin C intake is not increased — and yet a low-carb diet with far lower intake of vitamin C is not linked to scurvy. Maybe that is the reason ancient Vikings and Polynesians could remain healthy at sea for months, while modern sailors on a high-carb diet were plagued by scurvy (Sailors’ Rations, a High-Carb Diet). Similarly, a plant-based diet in general might require greater amounts of vitamin E: “Plant-based foods have higher concentrations of vitamin E. And for good reason. A plant-based diet requires additional protection from oxidation of PUFA which Vitamin E helps provide through its antioxidant properties. It’s still found in adequate supply in meat” (Kevin Stock, Vitamins and Minerals – Plants vs Animals).

What is adequate depends on the diet. A diet low in carbs, seed oils, and other plant foods may require fewer plant-based antioxidants, especially if this is countered by an increase of animal-based antioxidants. It is reminiscent of the fiber debate. Yes, fiber adds bulk that supposedly will increase regularity, ignoring the fact that the research is divided on this topic. No doubt bulking up your poop makes you have larger poops more often, but is that really a good thing? People on a low-residue carnivore diet more easily digest and absorb what they eat, and so they don’t have bulky poops — then again they don’t usually have constipation either, not if they’re getting enough dietary fat. The main cause of constipation is plant foods. So, why are people advised to eat more plant foods in the hope of resolving this issue caused by plant foods? It’s absurd! We keep looking at problems in isolation, as we look at nutrients in isolation (Hubris of Nutritionism). This has failed us, as demonstrated by our present public health crisis.

Let me throw in a last thought about antioxidants. It’s like the fiber issue. People on plant-based diets have constipation issues and so they eat more plant foods in the form of fiber, trying to solve the problem plant foods cause, not realizing that constipation generally resolves itself by eliminating or limiting plant foods. So, in relation to antioxidants, we have to ask ourselves what it is about our diet in the first place that is causing all the oxidative stress. Plant foods do have antioxidants, but some plant foods also cause oxidative stress (e.g., seed oils). If we eliminate these plant foods, our oxidative stress goes down and so our requirement for antioxidants lessens to that degree. Our body already produces its own antioxidants and, combined with what comes from animal foods, we shouldn’t need such excess amounts of antioxidants. Besides, it’s not clear from studies that plant antioxidants are always beneficial to health. It would be better to eliminate the need for them in the first place. Shawn Baker explained this in terms of vitamin C (interview with Shan Hussain, The Carnivore Diet with Dr. Shawn Baker MD):

“The Carnivore diet is deficient in carbohydrates and essential vitamins like Vitamin C, how do we make up for that? When I wanted to do this I was curious about this as well. You will see a number of potential deficiencies around this diet. There is no role of fibre in this diet. With Vitamin C we know there are some transporters across different cell membranes. In a higher glucose environment, Vitamin C is competitively inhibited and therefore we see less absorption of Vitamin C. We also see that interestingly human red blood cells do have the capacity to actually recycle Vitamin C which is something that not many people are aware of. One of the major function of Vitamin C is that it is an antioxidant. In low carbohydrate states our antioxidants systems particularly things like glutathione are regulated. We may obviate some of the need of antioxidants of the Vitamin C by regulating around systems in a low carb diet. Also, Vitamin C is very important in the function of carnitine which is part of the fat cycle. When we are ingesting carnitine we have actual transporters in the gut which can take up full carnosine. It is a misconception that we can only take amino acids, a number of di and tripeptide transporters that are contained within our gut. The other function of Vitamin C is when we don’t have sufficient Vitamin C relative to our needs, we start to develop symptoms of scurvy, bleeding gum problems, teeth falling out, sores and cuts won’t heal. This is all due to the collagen synthesis. If we look at Vitamin C’s role in collagen synthesis, it helps to take proline and lysine, hydroxyproline and hydroxylysine. In meat-based diet, we are getting that in ample amount. Even a steak has 3% of its content as collagen. There are all kinds of compensatory mechanisms.”

I’ll end on an amusing note. Chris Kresser wrote about the carnivore diet (Everything You Need to Know about the Carnivore Diet and How It Can Affect Your Health). Although an advocate of low-carb diets and nutrient-dense animal foods, he is skeptical that carnivory will be healthy for most humans long-term. One worry is that there might be nutritional deficiencies, but the argument he makes is funny. He is basically saying that if all one eats is muscle meat then key nutrients will get missed. Then he goes on to point out that these nutrients can be found in other animal foods, such as liver and dairy. So, his main concern about a carnivore diet is actually that people might not eat enough animal foods, or rather not enough of certain animal foods. So, make sure you eat lots of a wide variety of animal foods if going full carnivore and apparently even critics like Kresser agree you’ll be fine, at least nutritionally. The problem isn’t too much animal foods but potentially too little. That made me smile.

Now to the whole point of this post. Below is a list of nutrients that are commonly deficient in those on plant-based diets, especially those on plant-exclusive diets (i.e., vegans). I won’t explain anything about these nutrients, as there is plenty of info online. But you can look to the linked articles below that cover the details.

  • Vitamin K2 (MK-4)
  • Vitamin D3 (Cholecalciferol)
  • Vitamin A (Retinol)
  • Vitamin B12 (Cobalamin)
  • Vitamin B6 (Pyridoxine)
  • Vitamin B3 (Niacin)
  • Vitamin B2 (Riboflavin)
  • Calcium
  • Heme Iron
  • Zinc
  • Selenium
  • Iodine
  • Sulfur
  • Creatine
  • Carnosine
  • Beta-Alanine (precursor to Carnosine)
  • L-Carnitine
  • Taurine
  • Choline
  • Coenzyme Q10 (CoQ10)
  • Phytanic Acid
  • DHA Omega-3 (Docosahexaenoic Acid)
  • EPA Omega-3 (Eicosapentaenoic Acid)
  • DPA Omega-3 (Docosapentaenoic Acid)
  • ARA Omega-6 (Arachidonic Acid)
  • CLA (Conjugated Linoleic Acid)
  • Phosphatidylserine
  • Cholesterol
  • Collagen
  • Complete Protein
  • Glutathione
  • Glycine
  • Essential Amino Acids (Histidine, Isoleucine, Leucine, Lysine, Methionine, Phenylalanine, Threonine, Tryptophan, and Valine), plus other amino acids abundant in animal protein (Cysteine, Proline, Tyrosine, Serine, and Alanine)

Just for the sake of balance, I’ll also share a list of plant compounds that are problematic for many people — from Joe Cohen (20 Nutrients that Vegans & Vegetarians are Lacking):

  1. Lectins
  2. Amines
  3. Tannins
  4. Trypsin Inhibitors
  5. FODMAPS
  6. Salicylates
  7. Oxalates
  8. Sulfites, Benzoates, and MSG
  9. Non-protein amino acids
  10. Glycosides
  11. Alkaloids [includes solanine, chaconine]
  12. Triterpenes
  13. Lignins
  14. Saponins
  15. Phytic Acid [Also Called Phytate]
  16. Gluten
  17. Isoflavones

* * *

Are ‘vegetarians’ or ‘carnivores’ healthier?
Gundry’s Plant Paradox and Saladino’s Carnivory
Dr. Saladino on Plant and Animal Foods
True Vitamin A For Health And Happiness
Calcium: Nutrient Combination and Ratios
Vitamin D3 and Autophagy

The Vegetarian Myth: Food, Justice, and Sustainability
by Lierre Keith

Vegan Betrayal: Love, Lies, and Hunger in a Plants-Only World
by Mara J. Kahn

The Meat Fix: How a lifetime of healthy eating nearly killed me!
by John Nicholson

The Fat of the Land/Not By Bread Alone
by Vilhjalmur Stefansson

Sacred Cow: The Case for (Better) Meat: Why Well-Raised Meat Is Good for You and Good for the Planet
by Diana Rodgers and Robb Wolf

The Carnivore Code: Unlocking the Secrets to Optimal Health by Returning to Our Ancestral Diet
by Paul Saladino

Primal Body, Primal Mind: Beyond Paleo for Total Health and a Longer Life
by Nora Gedgaudas

Paleo Principles
by Sarah Ballantyne

The Queen of Fats: Why Omega-3s Were Removed from the Western Diet and What We Can Do to Replace Them
by Susan Allport

The Omega Principle: Seafood and the Quest for a Long Life and a Healthier Planet
by Paul Greenberg

The Omega-3 Effect: Everything You Need to Know About the Super Nutrient for Living Longer, Happier, and Healthier
by William Sears and James Sears

The Missing Wellness Factors: EPA and DHA: The Most Important Nutrients Since Vitamins?
by Jorn Dyerberg and Richard Passwater

Could It Be B12?: An Epidemic of Misdiagnoses
by Sally M. Pacholok and Jeffrey J. Stuart

What You Need to Know About Pernicious Anaemia and Vitamin B12 Deficiency
by Martyn Hooper

Living with Pernicious Anaemia and Vitamin B12 Deficiency
by Martyn Hooper

Pernicious Anaemia: The Forgotten Disease: The causes and consequences of vitamin B12 deficiency
by Martyn Hooper

Healing With Iodine: Your Missing Link To Better Health
by Mark Sircus

Iodine: Thyroid: The Hidden Chemical at the Center of Your Health and Well-being
by Jennifer Co

The Iodine Crisis: What You Don’t Know About Iodine Can Wreck Your Life
by Lynne Farrow

L-Carnitine and the Heart
by Stephen T. Sinatra and Jan Sinatra

Food Politics: How the Food Industry Influences Nutrition and Health
by Marion Nestle

Unsavory Truth: How Food Companies Skew the Science of What We Eat
by Marion Nestle

Formerly Known As Food: How the Industrial Food System Is Changing Our Minds, Bodies, and Culture
by Kristin Lawless

Death by Food Pyramid: How Shoddy Science, Sketchy Politics and Shady Special Interests Have Ruined Our Health
by Denise Minger

Nutrition in Crisis: Flawed Studies, Misleading Advice, and the Real Science of Human Metabolism
by Richard David Feinman

Nutritionism: The Science and Politics of Dietary Advice
by Gyorgy Scrinis

Measured Meals: Nutrition in America
by Jessica J. Mudry

(Although more about macronutrients, also see the work of Gary Taubes and Nina Teicholz. They add useful historical context about nutrition studies, dietary advice, and public health.)

20 Nutrients that Vegans & Vegetarians are Lacking
by Joe Cohen

8 Nutrients You May Be Missing If You’re Vegetarian or Vegan
by Tina Donvito

7 Nutrients That You Can’t Get from Plants
by Atli Arnarson

7 Supplements You Need on a Vegan Diet
by Alina Petre

The Top 5 Nutrient Deficiencies on a Plant Based Diet
by Kate Barrington

5 Brain Nutrients That You Can’t Get From Plants
by Kris Gunnars

Vitamin Supplements for Vegetarians
by Jeff Takacs

Health effects of vegan diets
by Winston J Craig

Nutritional Deficiencies and Essential Considerations for Every Vegan (An Evidence-Based Nutritional Perspective)
from Dai Manuel

Why You Should Think Twice About Vegetarian and Vegan Diets
by Chris Kresser

Three Big Reasons Why You Don’t Want to be a Vegetarian
by Alan Sears

How to Avoid Common Nutrient Deficiencies if You’re a Vegan
by Joseph Mercola

What is Glutathione and How Do I Get More of It?
by Mark Hyman

Could THIS Be the Hidden Factor Behind Obesity, Heart Disease, and Chronic Fatigue?
by Joseph Mercola

Vegetarianism produces subclinical malnutrition, hyperhomocysteinemia and atherogenesis
by Y. Ingenbleek and K. S. McCully

Vegan Diet is Sulfur Deficient and Heart Unhealthy
by Larry H. Bernstein

Heart of the Matter: Sulfur Deficits in Plant-Based Diets
by Kaayla Daniel

Copper-Zinc Imbalance: Unrecognized Consequence of Plant-Based Diets and a Contributor to Chronic Fatigue
by Laurie Warner

Vegan diets ‘risk lowering intake of nutrient critical for unborn babies’ brains’
by Richard Hartley-Parkinson

The Effects of a Mother’s Vegan Diet on Fetal Development
by Marc Choi

Vegan–vegetarian diets in pregnancy: danger or panacea? A systematic narrative review
by G. B. Piccoli

Is vegetarianism healthy for children?
by Nathan Cofnas

Clinical practice: vegetarian infant and child nutrition
by M. Van Winckel, S. Vande Velde, R. De Bruyne, and S. Van Biervliet

Dietary intake and nutritional status of vegetarian and omnivorous preschool children and their parents in Taiwan
by C. E. Yen, C. H. Yen, M. C. Huang, C. H. Cheng, and Y. C. Huang

Persistence of neurological damage induced by dietary vitamin B-12 deficiency in infancy
by Ursula von Schenck, Christine Bender-Götze, and Berthold Koletzko

Severe vitamin B12 deficiency in an exclusively breastfed 5-month-old Italian infant born to a mother receiving multivitamin supplementation during pregnancy
by S. Guez et al

Long-chain n-3 PUFA in vegetarian women: a metabolic perspective
by G. C. Burdge, S. Y. Tan, and C. J. Henry

Signs of impaired cognitive function in adolescents with marginal cobalamin status
by M. W. Louwman et al

Transient neonatal hypothyroidism due to a maternal vegan diet
by M. G. Shaikh, J. M. Anderson, S. K. Hall, M. A. Jackson

Veganism as a cause of iodine deficient hypothyroidism
by O. Yeliosof and L. A. Silverman

Do plant based diets deprive the brain of an essential nutrient?
by Ana Sandoiu

Suggested move to plant-based diets risks worsening brain health nutrient deficiency
from BMJ

Could we be overlooking a potential choline crisis in the United Kingdom?
by Emma Derbyshire

How a vegan diet could affect your intelligence
by Zaria Gorvett

Vitamins and Minerals – Plants vs Animals
by Kevin Stock

Comparing Glutathione in the Plasma of Vegetarian and Omnivore Populations
by Rachel Christine Manley

Vegan diets are adding to malnutrition in wealthy countries
by Chris Elliott, Chen Situ, and Claire McEvoy

What beneficial compounds are primarily found in animal products?
by Kamal Patel

The Brain Needs Animal Fat
by Georgia Ede

The Vegan Brain
by Georgia Ede

Meat, Organs, Bones and Skin
by Christopher Masterjohn

Vegetarianism and Nutrient Deficiencies
by Christopher Masterjohn

Adding milk, meat to diet dramatically improves nutrition for poor in Zambia
from Science Daily

Red meat plays vital role in diets, claims expert in fightback against veganism
by James Tapper

Nutritional Composition of Meat
by Rabia Shabir Ahmad, Ali Imran and Muhammad Bilal Hussain

Meat and meat products as functional food
by Maciej Ostaszewski

Meat: It’s More than Protein
from Paleo Leap

Conjugated Linoleic Acid: the Weight Loss Fat?
from Paleo Leap

Nutritional composition of red meat
by P. G. Williams

How Red Meat Can ‘Beef Up’ Your Nutrition
by David Hu

Endogenous antioxidants in fish
by Margrét Bragadóttir

Astaxanthin Benefits Better than Vitamin C?
by Rachael Link

Astaxanthin: The Most Powerful Antioxidant You’ve Never Heard Of
from XWERKS

Antioxidants Are Bullshit for the Same Reason Eggs Are Healthy
by Sam Westreich

We absolutely need fruits and vegetables to obtain optimal antioxidant status, right?
by Paul Saladino

Hen Egg as an Antioxidant Food Commodity: A Review
by Chamila Nimalaratne and Jianping Wu

Eggs’ antioxidant properties may help prevent heart disease and cancer, study suggests
from Science Daily

The Ultimate Superfood? Milk Offers Up a Glass Full of Antioxidants
by Lauren Milligan Newmark

Antioxidant properties of Milk and dairy products: a comprehensive review of the current knowledge
by Imran Taj Khan et al

Antioxidants in cheese may offset blood vessel damage
from Farm and Dairy

Identification of New Peptides from Fermented Milk Showing Antioxidant Properties: Mechanism of Action
by Federica Tonolo

Bioavailability of iron, zinc, and other trace minerals from vegetarian diets
by Janet R Hunt

Dietary iron intake and iron status of German female vegans: results of the German vegan study.
by A. Waldmann, J. W. Koschizke, C. Leitzmann, and A. Hahn

Mechanisms of heme iron absorption: Current questions and controversies
by Adrian R. West and Phillip S. Oates

Association between Haem and Non-Haem Iron Intake and Serum Ferritin in Healthy Young Women
by Isabel Young et al

Pork meat increases iron absorption from a 5-day fully controlled diet when compared to a vegetarian diet with similar vitamin C and phytic acid content.
by M. Bach Kristensen, O. Hels, C. Morberg, J. Marving, S. Bügel, and I. Tetens

Do you need fiber?
by Kevin Stock

Americans Fatter at Same Level of Food Intake and Exercise

Americans, to state the obvious, are unhealthier with each passing generation. And the most obvious sign of this is the rising obesity rate. In one analysis, this was shown to be true even when controlling for levels of food intake and exercise (see article below). This is the kind of data that undermines conventional dietary advice based on Christian moralizing about the deadly sins of gluttony and sloth.

Heart attacks and obesity first became a public health concern in the 1940s and 1950s. That followed decades in which seed oils and margarine had mostly replaced lard in the American diet. We were told that saturated fat is dangerous and that seed oils were great for health. Americans were listening and they strictly followed this advice. Even restaurants stopped cooking their french fries in tallow.

In particular, olive oil has been sold as the best. Why is olive oil supposed to be so healthy? Because it has monounsaturated fat, the same as is primarily found in lard. Not too long ago, the healthiest population in the United States was in Roseto, Pennsylvania. Guess what their main source of fat was? Lard. They also ate massive loads of meat, as do other long-lived populations in the world such as in Hong Kong.

Red meat consumption also decreased over that period and has continued to decline since then. Dairy has followed this pattern of decline. Americans are eating less animal fat now than ever before in American history, or probably human existence. It’s true that Americans are eating more lean chicken and fish, but we were told those are healthy for us. Meanwhile, Americans are eating more fruits and vegetables, nuts and seeds than ever before.

Calories-in/calories-out has been an utter failure. It’s not how much we are eating but what we are eating. That then determines how our metabolism functions, whether it burns fat or stores it. Exercise is largely irrelevant for fat loss. Fat people can exercise all the time and not lose weight, while some skinny people hardly move at all. Another study “demonstrated that there is no difference in total energy expenditure between traditional hunter-gathers, subsistence farmers and modern Westerners.”

One explanation is an increase of obesogens. These are chemicals that cause the body to create fat. In general, fat is where the body stores excess toxins that overwhelm the body. And indeed younger Americans are exposed to more toxins. Then this makes losing weight hard because all the toxins get released and make one feel like shit. It’s hard for the body to eliminate a lifetime of accumulated toxicity. On top of that, the young are prescribed more medications than ever before. Antidepressants and antipsychotics have been given out like candy for anyone with mild mental issues. What is a common side effect of these drugs? Yep, weight gain.

A third possibility is more complex. We know the gut microbiome has shrunk in number and diversity. It’s also changed in the profile of bacteria. Research is showing how important the microbiome is (see The Secret Life of Your Microbiome by Susan L. Prescott and Alan C. Logan). Toxins and drugs, by the way, also alter the microbiome. So does diet. Even if total calorie intake hasn’t changed much relative to the increased height of the population, what has changed is what we are eating.

In place of animal fats, we are eating not only more seed oils but also more carbs and sugar. Animal fats are highly satiating and so food companies realized they needed to find something equally satiating. It turns out a high-carb diet is not only satiating but addictive. It knocks people out of ketosis and causes them to put on weight. It doesn’t matter if one tries to eat less. In processed foods, when carbs are combined with seed oils, the body is forced to burn the carbs immediately and so it has no choice but to turn the seed oils into fat.

By the way, what alters metabolism also alters the microbiome. This is seen when people go from a high-carb diet to a ketogenic diet. Ketosis is powerful in its impact on how the body functions in so many ways, even changing the epigenetic expression of genes. Here is the worst part. Those epigenetic changes have been happening for generations with the loss of regular ketosis. Even epigenetic markers for obesity, following an environmental trigger like famine, have been shown to pass across multiple generations. The microbiome, of course, is also inherited, and each of those bacteria likewise has an epigenome that determines its genetic expression.

Everything we do as individuals, good and bad, doesn’t only affect us as individuals. People are getting fatter now not only because of what they are doing differently but because of everything that was done by their parents, grandparents, and great-grandparents. As I’ve said before, even if we reversed all these changes instantly, as we are unlikely to do, it would still require generations to fully reverse the consequences.

* * *

Why It Was Easier to Be Skinny in the 1980s
by Olga Khazan

A study published recently in the journal Obesity Research & Clinical Practice found that it’s harder for adults today to maintain the same weight as those 20 to 30 years ago did, even at the same levels of food intake and exercise. […]

Just what those other changes might be, though, is still a matter of hypothesis. In an interview, Kuk proffered three different factors that might be making it harder for adults today to stay thin.

First, people are exposed to more chemicals that might be weight-gain inducing. Pesticides, flame retardants, and the substances in food packaging might all be altering our hormonal processes and tweaking the way our bodies put on and maintain weight.

Second, the use of prescription drugs has risen dramatically since the 1970s and ’80s. Prozac, the first blockbuster SSRI, came out in 1988. Antidepressants are now one of the most commonly prescribed drugs in the U.S., and many of them have been linked to weight gain.

Finally, Kuk and the other study authors think that the microbiomes of Americans might have somehow changed between the 1980s and now. It’s well known that some types of gut bacteria make a person more prone to weight gain and obesity. Americans are eating more meat than they were a few decades ago, and many animal products are treated with hormones and antibiotics in order to promote growth. All that meat might be changing gut bacteria in ways that are subtle, at first, but add up over time. Kuk believes that the proliferation of artificial sweeteners could also be playing a role.

Why Do Americans Keep Getting Fatter?
by Chris Bodenner

Notwithstanding the known errors of dietary assessment, it is interesting that we observe consistent trends over time in terms of how dietary intake relates with obesity and how this relationship has changed over time. This lends more confidence to our primary findings and suggests that there are either physiological changes in how diet relates with body weight or differences in how individuals are reporting their dietary intake over time. […]

[W]e observed that the BMI associated with a given leisure time physical activity frequency was still higher over time in men. This may be attributed to changes in non-leisure time physical activity such as reductions in occupational physical activity or increasing screen time. However, a study using doubly labelled water demonstrated that there is no difference in total energy expenditure between traditional hunter-gathers, subsistence farmers and modern Westerners. Thus, numerous other factors in addition to energy intake and physical activity may be important to consider when trying to explain the rise in obesity, and should be further evaluated in further studies.

Dietary Risk Factors for Heart Disease and Cancer

Based on a study of 42 European countries, a recent scientific paper reported that, “the highest CVD [cardiovascular disease] prevalence can be found in countries with the highest carbohydrate consumption, whereas the lowest CVD prevalence is typical of countries with the highest intake of fat and protein.” And that, “The positive effect of low-carbohydrate diets on CVD risk factors (obesity, blood lipids, blood glucose, insulin, blood pressure) is already apparent in short-term clinical trials lasting 3–36 months (58) and low-carbohydrate diets also appear superior to low-fat diets in this regard (36, 37).” Basically, for heart health, this would suggest eating more full-fat dairy, eggs, meat, and fish while eating less starch, sugar, and alcohol. That is to say, follow a low-carb diet. That doesn’t mean any low-carb diet will do, though, for the focus is on animal foods.

By the way, when you dig into the actual history of the Blue Zones (healthy, long-lived populations), what you find is that their traditional diets included large portions of animal foods, including animal fat (Blue Zones Dietary Myth, Eat Beef and Bacon!, Ancient Greek View on Olive Oil as Part of the Healthy Mediterranean Diet). The longest-lived society in the entire world, in fact, is also the one with the highest meat consumption per capita, even more than Americans. What society is that? Hong Kong. In general, nutrition research in Asia has long shown that those eating more meat have the best health outcomes. This contradicts earlier Western research, as the healthy user effect manifests differently according to culture. But even in the West, the research is increasingly falling in line with the Eastern research, such as the study I quoted above. And that study is far from being the only one (Are ‘vegetarians’ or ‘carnivores’ healthier?).

This would apply to both meat-eaters and vegetarians, as even vegetarians could put greater emphasis on nutrient-dense animal foods. It is specifically saturated fat and animal protein that were most strongly associated with better health, both of which can be obtained from dairy and eggs. Vegans, on the other hand, would obviously be deficient in this area. But certain plant foods (tree nuts, olives, citrus fruits, low-glycemic vegetables, and wine, though not distilled beverages) also showed some benefit. Among plant foods, those specifically associated with greater risk of heart disease, strokes, etc. were those high in carbohydrates, such as grains. Unsurprisingly, sunflower oil was a risk factor, probably related to seed oils being inflammatory and oxidative (not to mention mutagenic); but onions, oddly, were also implicated, if only weakly. Other foods showed up in the data, but the above were the most interesting and important.

Such correlations, of course, can’t prove causation. But it fits the accumulating evidence: “These findings strikingly contradict the traditional ‘saturated fat hypothesis’, but in reality, they are compatible with the evidence accumulated from observational studies that points to both high glycaemic index and high glycaemic load (the amount of consumed carbohydrates × their glycaemic index) as important triggers of CVDs. The highest glycaemic indices (GI) out of all basic food sources can be found in potatoes and cereal products, which also have one of the highest food insulin indices (FII) that betray their ability to increase insulin levels.” All of that seems straightforward, according to the overall data from nutrition studies (see: Uffe Ravnskov, Richard Smith, Robert Lustig, Eric Westman, Ben Bikman, Gary Taubes, Nina Teicholz, etc). On saturated fat not being linked to CVD risk, Andrew Mente discusses a meta-analysis he worked on and another by Siri-Tarino et al. (New Evidence Reveals that Saturated Fat Does Not Increase the Risk of Cardiovascular Disease). Likewise, many experts no longer see cholesterol as a culprit either (Uffe Ravnskov et al, LDL-C does not cause cardiovascular disease: a comprehensive review of the current literature).
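
To make the quoted glycaemic load definition concrete, here is a minimal sketch of the arithmetic. The carb amounts and GI values are rough illustrative assumptions on my part, not figures from the paper; by convention the product is divided by 100 so that GL is expressed per serving.

```python
# Minimal sketch of the glycaemic load (GL) calculation described above:
# GL = glycaemic index x grams of carbohydrate consumed, divided by 100.

def glycaemic_load(carbs_g: float, gi: float) -> float:
    """Per-serving glycaemic load from carbohydrate grams and glycaemic index."""
    return gi * carbs_g / 100

# Approximate example servings (assumed values, for illustration only).
servings = {
    "boiled potato (150 g)": (28, 80),   # ~28 g carbs, GI ~80
    "white bread (2 slices)": (30, 75),  # ~30 g carbs, GI ~75
    "lentils (150 g)": (18, 30),         # ~18 g carbs, GI ~30
}

for food, (carbs_g, gi) in servings.items():
    print(f"{food}: GL ~ {glycaemic_load(carbs_g, gi):.0f}")
```

On these assumed numbers, the potato and bread servings land around GL 22 while the lentils land around 5, which is the kind of gap between starchy and low-glycemic foods that the quoted passage is pointing at.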

Yet one other odd association was discovered: “In fact, our ecological comparison of cancer incidence in 39 European countries (for 2012; (59)) can bring another important argument. Current rates of cancer incidence in Europe are namely the exact geographical opposite of CVDs (see Fig. 28). In sharp contrast to CVDs, cancer correlates with the consumption of animal food (particularly animal fat), alcohol, a high dietary protein quality, high cholesterol levels, high health expenditure, and above average height. These contrasting patterns mirror physiological mechanisms underlying physical growth and the development of cancer and CVDs (60). The best example of this health paradox is again that of French men, who have the lowest rates of CVD mortality in Europe, but the highest rates of cancer incidence. In other words, cancer and CVDs appear to express two extremes of a fundamental metabolic disbalance that is related to factors such as cholesterol and IGF-1 (insulin-like growth factor).”

That is an argument people have made, but it’s largely been theoretical. In response, others have argued the opposite position (High vs Low Protein, Too Much Protein?, Gundry’s Plant Paradox and Saladino’s Carnivory, Carcinogenic Grains). It’s true that, for example, eating meat increases IGF-1, at least temporarily. Then again, eating in general does the same. And on a diet low enough in carbs, studies have shown that people naturally reduce their calorie intake, which would reduce IGF-1. As for very low-carb, the ketogenic diet is specifically defined as moderate in protein and high in fat. A low-carb diet is not necessarily a high-animal-protein diet, especially when combined with intermittent fasting such as OMAD (one meal a day), which involves long periods of downregulated IGF-1. Also, this study didn’t appear to include plant proteins in the data, so we don’t know if eating lots of soy, hemp protein powder, etc. would show similar results; although nuts were mentioned in the report as correlating with lower CVD risk, similar to animal foods, they were not, as far as I know, mentioned in terms of cancer. What would make animal proteins more carcinogenic than plant proteins or, for that matter, plant carbohydrates? The hypothetical mechanism is not clear.

This anomaly would’ve been more interesting if the authors had surveyed the research literature. It’s hard to know what to make of it since other studies have pointed to the opposite conclusion, that the risks of these two are closely linked, rather than being inversely associated: “Epidemiologically, a healthy lifestyle lessens the risk of both cardiovascular disease and cancer, as first found in the Nurses’ Health study” (Lionel Opie, Cancer and cardiovascular disease; see Rob M. Van Dam, Combined impact of lifestyle factors on mortality). “Research has shown there are interrelationships among type 2 diabetes, heart disease, and cancer. These interrelationships may seem coincidental and based only on the fact these conditions share common risk factors. However, research suggests these diseases may relate to one another in multiple ways and that nutrition and lifestyle strategies used to prevent and manage these diseases overlap considerably” (Karen Collins, The Cancer, Diabetes, and Heart Disease Link).

Yet other researchers did find the same inverse relationship: “We herein report that, based on two separate medical records analysis, an inverse correlation between cancer and atherosclerosis” (Matthew Li et al, If It’s Not One Thing, It’s Another). But there was an additional point: “We believe that the anti-inflammatory aspect of cancer’s pan-inflammatory response plays an important role towards atherosclerotic attenuation.” Interesting! In that case, one of the key causal mechanisms to be considered is inflammation. Some diets high in animal proteins would be inflammatory, such as the Standard American Diet, whereas others would be anti-inflammatory. Eliminating seed oils (e.g., sunflower oil) would by itself reduce inflammation. Reducing starches and sugar would help as well. So, is it the meat that increases cancer or is it what the meat is being cooked in or eaten with? That goes back to the healthy and unhealthy user effects.

As this confounding factor is central, we might want to consider the increasingly common view that inflammation is involved in nearly every major disease. “For example, inflammation causes or is a causal link in many health problems or otherwise seen as an indicator of health deterioration (arthritis, depression, schizophrenia, etc), but inflammation itself isn’t the fundamental cause since it is a protective response itself to something else (allergens, leaky gut, etc). Or as yet another example, there is the theory that cholesterol plaque in arteries doesn’t cause the problem but is a response to it, as the cholesterol is essentially forming a scab in seeking to heal injury. Pointing at cholesterol would be like making accusations about firefighters being present at fires” (Coping Mechanisms of Health).

What exacerbates or moderates inflammation will be pivotal to overall health (Essentialism On the Decline), especially the nexus of disease called metabolic syndrome/derangement or what used to be called syndrome X: insulin resistance, diabetes, obesity, heart disease, strokes, etc. In fact, other researchers point directly to inflammation as being a common factor of CVD and cancer: “Although commonly thought of as two separate disease entities, CVD and cancer possess various similarities and possible interactions, including a number of similar risk factors (e.g. obesity, diabetes), suggesting a shared biology for which there is emerging evidence. While chronic inflammation is an indispensable feature of the pathogenesis and progression of both CVD and cancer, additional mechanisms can be found at their intersection” (Ryan J. Koene et al, Shared Risk Factors in Cardiovascular Disease and Cancer). But how inflammation manifests as disease might depend on the specific conditions — not only as CVD or cancer but also as arthritis, depression, Alzheimer’s, etc.

This is the major downfall of nutrition studies, as the experts in the field find themselves hopelessly mired in a replication crisis. There is too much contradictory research and, where much of the research has been repeated, it simply did not replicate. That is to say much of it is simply wrong or misinterpreted. And as few have attempted to replicate much of the rest, we aren’t entirely sure what is valid and what is not. That further problematizes meta-analyses, despite how potentially powerful that tool can be when working with quality research. The study I’ve been discussing here was an ecological study, and that design has its limitations. The researchers couldn’t disentangle all the major confounding factors, much less control for them in the first place, as they were working with data across decades that came from separate countries. Even so, it’s interesting and useful information to consider. And keep in mind that almost all official dietary recommendations are based on observational (associative, correlative, epidemiological) studies with far fewer controls. This is the nature of the entire field of nutrition studies, as long-term randomized and controlled studies on humans are next to impossible to do.
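
To illustrate the core weakness of bivariate ecological correlations, here is a toy sketch on entirely synthetic data (nothing below comes from the study itself). A shared driver such as health expenditure can push a dietary variable and a CVD outcome in opposite directions, producing a strong raw country-level correlation that mostly vanishes once the confounder is partialled out.

```python
# Toy demonstration (synthetic data) of how a shared confounder can inflate
# a bivariate ecological correlation, and how a partial correlation exposes it.
import numpy as np

rng = np.random.default_rng(42)
n = 42  # a hypothetical set of 42 countries

confounder = rng.normal(size=n)                            # e.g., health expenditure
diet = 0.8 * confounder + rng.normal(scale=0.6, size=n)    # country-level dietary variable
cvd = -0.8 * confounder + rng.normal(scale=0.6, size=n)    # country-level CVD outcome

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

print(f"raw r(diet, CVD)           = {pearson(diet, cvd):+.2f}")                    # strongly negative
print(f"partial r given confounder = {partial_corr(diet, cvd, confounder):+.2f}")   # near zero
```

The raw correlation looks impressive even though diet and outcome are, by construction, unrelated once the confounder is accounted for. This is exactly the trap the study’s authors acknowledge below when they caution against reliance on bivariate correlations.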

So, as always, qualifications must be made. The study’s authors state that, “In items of smaller importance (e.g. distilled beverages, sunflower oil, onions), the results are less persuasive and their interpretation is not always easy and straightforward. Similar to observational studies, our ecological study reflects ‘real-world data’ and cannot always separate mutual interactions among the examined variables. Therefore, the reliance on bivariate correlations could lead to misleading conclusions. However, some of these findings can be used as a starting point of medical hypotheses, whose validity can be investigated in controlled clinical trials.” Nonetheless, “The reasonably high accuracy of the input data, combined with some extremely high correlations, together substantially increase the likelihood of true causal relationships, especially when the results concern principal components of food with high consumption rates, and when they can be supported by other sources.”

This data is meaningful in offering strong supporting evidence. The finding about animal foods and starchy foods is the main takeaway, however tentative the conclusion may be for real-world application, at least taking this evidence in isolation. But the inverse correlation of CVD risk and cancer risk stands out and probably indicates confounders across populations, and that would be fertile territory for other researchers to explore. The main importance of this study is less in the specifics and more in how it further challenges the broad paradigm that has dominated nutrition studies for the past half century or so. The most basic point is that the diet-heart hypothesis simply doesn’t make sense of the evidence and it never really did. When the hypothesis was first argued, heart disease was going up precisely at the moment saturated fat intake was going down, since seed oils had replaced lard as the main fat source in the decades prior. Interestingly, lard has been a common denominator among most long-lived populations, from the Okinawans to the Rosetans (Ancient Greek View on Olive Oil as Part of the Healthy Mediterranean Diet, Blue Zones Dietary Myth).

This study is further support for a newly emerging understanding, as seen with the American Heart Association backing off from its earlier position (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines). Fat is not the enemy of humanity, as seen with the high-fat ketogenic diet where fat is used as the primary fuel instead of carbohydrates (Ketogenic Diet and Neurocognitive Health, The Ketogenic Miracle Cure, The Agricultural Mind). In fact, we wouldn’t be here without fat, as it is the evolutionary and physiological norm, specifically in terms of low-carb (Is Ketosis Normal?, “Is keto safe for kids?”). Indeed, it used to be common knowledge that too many carbohydrates are unhealthy (American Heart Association’s “Fat and Cholesterol Counter” (1991)). Consensus shifted a half century ago, when low-carb diets were last part of mainstream thought, and now it is shifting back the other way. The old consensus will be new again.

* * *

Carbohydrates, not animal fats, linked to heart disease across 42 European countries
by Keir Watson

Key findings

  • Cholesterol levels were tightly correlated to the consumption of animal fats and proteins – Countries consuming more fat and protein from animal sources had higher incidence of raised cholesterol
  • Raised cholesterol correlated negatively with CVD risk – Countries with higher levels of raised cholesterol had fewer cases of CVD deaths and a lower incidence of CVD risk factors
  • Carbohydrates correlated positively with CVD risk – the more carbohydrates consumed (and especially those with high GI such as starches) the more CVD
  • Fat and Protein correlated negatively with CVD risk – Countries consuming more fat and protein from animal and plant sources had less CVD. The authors speculate that this is because increasing fat and protein in the diet generally displaces carbohydrates.

Food consumption and the actual statistics of cardiovascular diseases: an epidemiological comparison of 42 European countries
Pavel Grasgruber,* Martin Sebera, Eduard Hrazdira, Sylva Hrebickova, and Jan Cacek

Results

We found exceptionally strong relationships between some of the examined factors, the highest being a correlation between raised cholesterol in men and the combined consumption of animal fat and animal protein (r=0.92, p<0.001). The most significant dietary correlate of low CVD risk was high total fat and animal protein consumption. Additional statistical analyses further highlighted citrus fruits, high-fat dairy (cheese) and tree nuts. Among other non-dietary factors, health expenditure showed by far the highest correlation coefficients. The major correlate of high CVD risk was the proportion of energy from carbohydrates and alcohol, or from potato and cereal carbohydrates. Similar patterns were observed between food consumption and CVD statistics from the period 1980–2000, which shows that these relationships are stable over time. However, we found striking discrepancies in men’s CVD statistics from 1980 and 1990, which can probably explain the origin of the ‘saturated fat hypothesis’ that influenced public health policies in the following decades.

Conclusion

Our results do not support the association between CVDs and saturated fat, which is still contained in official dietary guidelines. Instead, they agree with data accumulated from recent studies that link CVD risk with the high glycaemic index/load of carbohydrate-based diets. In the absence of any scientific evidence connecting saturated fat with CVDs, these findings show that current dietary recommendations regarding CVDs should be seriously reconsidered. […]

Irrespective of the possible limitations of the ecological study design, the undisputable finding of our paper is the fact that the highest CVD prevalence can be found in countries with the highest carbohydrate consumption, whereas the lowest CVD prevalence is typical of countries with the highest intake of fat and protein. The polarity between these geographical patterns is striking. At the same time, it is important to emphasise that we are dealing with the most essential components of the everyday diet.

Health expenditure – the main confounder in this study – is clearly related to CVD mortality, but its influence is not apparent in the case of raised blood pressure or blood glucose, which depend on the individual lifestyle. It is also difficult to imagine that health expenditure would be able to completely reverse the connection between nutrition and all the selected CVD indicators. Therefore, the strong ecological relationship between CVD prevalence and carbohydrate consumption is a serious challenge to the current concepts of the aetiology of CVD.

The positive effect of low-carbohydrate diets on CVD risk factors (obesity, blood lipids, blood glucose, insulin, blood pressure) is already apparent in short-term clinical trials lasting 3–36 months (58) and low-carbohydrate diets also appear superior to low-fat diets in this regard (36, 37). However, these findings are still not reflected by official dietary recommendations that continue to perpetuate the unproven connection between saturated fat and CVDs (25). Understandably, because of the chronic nature of CVDs, the evidence for the connection between carbohydrates and CVD events/mortality comes mainly from longitudinal observational studies and there is a lack of long-term clinical trials that would provide definitive proof of such a connection. Therefore, our data based on long-term statistics of food consumption can be important for the direction of future research.

In fact, our ecological comparison of cancer incidence in 39 European countries (for 2012; (59)) can bring another important argument. Current rates of cancer incidence in Europe are namely the exact geographical opposite of CVDs (see Fig. 28). In sharp contrast to CVDs, cancer correlates with the consumption of animal food (particularly animal fat), alcohol, a high dietary protein quality, high cholesterol levels, high health expenditure, and above average height. These contrasting patterns mirror physiological mechanisms underlying physical growth and the development of cancer and CVDs (60). The best example of this health paradox is again that of French men, who have the lowest rates of CVD mortality in Europe, but the highest rates of cancer incidence. In other words, cancer and CVDs appear to express two extremes of a fundamental metabolic disbalance that is related to factors such as cholesterol and IGF-1 (insulin-like growth factor).

Besides total fat and protein consumption, the most likely preventive factors emerging in our study include fruits (particularly citrus fruits), wine, high-fat dairy products (especially cheese), sources of plant fat (tree nuts, olives), and potentially even vegetables and other low-glycaemic plant sources, provided that they substitute high-glycaemic foods. Many of these foodstuffs are the traditional components of the ‘Mediterranean diet’, which again strengthens the meaningfulness of our results. The factor analysis (Factor 3) also highlighted coffee, soybean oil and fish & seafood, but except for the fish & seafood, the rationale of this finding is less clear, because coffee is strongly associated with fruit consumption and soybean oil is used for various culinary purposes. Still, some support for the preventive role of coffee does exist (61) and hence, this observation should not be disregarded.

Similar to the “Mediterranean diet”, the Dietary Approaches to Stop Hypertension (DASH) diet, which is based mainly on fruits, vegetables, and low-fat dairy, also proved to be quite effective (62). However, our data indicate that the consumption of low-fat dairy may not be an optimal strategy. Considering the unreliability of observational studies highlighting low-fat dairy and the existence of strong bias regarding the intake of saturated fat, the health effect of various dairy products should be carefully tested in controlled clinical studies. In any case, our findings indicate that citrus fruits, high-fat dairy (such as cheese) and tree nuts (walnuts) constitute the most promising components of a prevention diet.

Among other potential triggers of CVDs, we should especially stress distilled beverages, which consistently correlate with CVD risk, in the absence of any relationship with health expenditure. The possible role of sunflower oil and onions is much less clear. Although sunflower oil consistently correlates with stroke mortality in the historical comparison and creates very productive regression models with some correlates of the actual CVD mortality, it is possible that both these food items mirror an environment that is deficient in some important factors correlating negatively with CVD risk.

A very important case is that of cereals because whole grain cereals are often propagated as CVD prevention. It is true that whole grain cereals are usually characterised by lower GI and FII values than refined cereals, and their benefits have been documented in numerous observational studies (63), but their consumption is also tied with a healthy lifestyle. All the available clinical trials have been of short duration and have produced inconsistent results indicating that the possible benefits are related to the substitution of refined cereals for whole grain cereals, and not because of whole grain cereals per se (64, 65). Our study cannot differentiate between refined and unrefined cereals, but both are highly concentrated sources of carbohydrates (~70–75% weight, ~80–90% energy) and cereals also make up ~50% of CA energy intake in general. To use an analogy with smoking, a switch from unfiltered to filtered cigarettes can reduce health risks, but this fact does not mean that filtered cigarettes should be propagated as part of a healthy lifestyle. In fact, even some unrefined cereals [such as the ‘whole-meal bread’ tested by Bao et al. (32)] have high glycaemic and insulin indices, and the values are often unpredictable. Therefore, in the light of the growing evidence pointing to the negative role of carbohydrates, and considering the lack of any association between saturated fat and CVDs, we are convinced that the current recommendations regarding diet and CVDs should be seriously reconsidered.

Are ‘vegetarians’ or ‘carnivores’ healthier?

The field of nutrition studies has been plagued with problems. Most of the research in the past was extremely low quality. Few other fields would allow such weak research to be published in peer-reviewed journals. Yet for generations, epidemiological (observational and correlational) studies were the norm in nutrition research. This kind of research is fine for preliminary exploration in formulating new hypotheses to test, but it is entirely useless for proving or disproving any given hypothesis. Shockingly, almost all medical advice and government recommendations on diet and nutrition are based on this superficial and misleading level of results.

The main problem is there has been little, if any, control of confounding factors. Also, the comparisons used were pathetically weak. It turns out that, in studies, almost any dietary protocol or change improves health compared to a standard American diet (SAD) or other varieties of standard industrialized diets based on processed foods of refined carbs (particularly wheat), added sugar (particularly high fructose corn syrup), omega-6 seed oils (inflammatory, oxidative, and mutagenic), food additives (from glutamate to propionate), and nutrient-deficient, chemical-drenched agricultural crops (glyphosate among the worst). Assuming the dog got decent food, even eating dog shit would be better for your health than SAD.

Stating that veganism or the Mediterranean diet is healthier than what most people eat really tells us nothing at all. That is even more true when the healthy user effect is not controlled for, as typically is the case with most studies. When comparing people on these diets to typical meat eaters, the ‘carnivores’ also are eating tons of carbs, sugar, and seed oils with their meat (buns, french fries, pop, etc; and, for cooking and in sauces, seed oils; not to mention snacking all day on chips, crackers, cookies, and candy). The average meat-eater consumes far more non-animal foods than animal foods, and most processed junk food is made mostly or entirely with vegan ingredients. So why do the animal foods get all the blame? And why does saturated fat get blamed when, starting back in the 1930s, seed oils replaced lard as the main source of cooking fat/oil?

If scientists in this field were genuinely curious, intellectually humble, not ideologically blinded, and unbiased by big food and big farm funding, they would make honest and fair comparisons to a wide variety of optimally designed diets. Nutritionists have known about low-carb, keto, and carnivore diets for about a century. The desire to research these diets, however, has been slim to none. The first-ever study of the carnivore diet, including fully meat-based versions, is happening right now. To give some credit, research has slowly been improving. I came across a 2013 study that compared four diets: “vegetarian, carnivorous diet rich in fruits and vegetables, carnivorous diet less rich in meat, and carnivorous diet rich in meat” (Nathalie T. Burkert et al, Nutrition and Health – The Association between Eating Behavior and Various Health Parameters: A Matched Sample Study).

It’s still kind of amusing that the researchers called carnivorous a “diet rich in fruits and vegetables” and a “diet less rich in meat.” If people are mostly eating plant foods or otherwise not eating much meat, how exactly is that carnivorous in any meaningful and practical sense? Only one of the four diets was carnivorous in the sense the average person would understand it, as a diet largely based on animal foods. Even then, the study doesn’t include a carnivorous diet entirely based on animal foods. Those carnivores eating a “diet rich in meat” might still be eating plenty of processed junk food, their meat might still be cooked or slathered in harmful seed oils and come with a bun, and they might still be washing it down with sugary drinks. A McDonald’s Big Mac meal could be considered part of a diet rich in meat, just because meat represents the greatest portion of weight and calories. Even if a diet were only 5-10% unhealthy plant foods, that portion could still be doing severe damage to health. One can fit a fairly large amount of carbs, seed oils, etc. into a relatively small portion of the diet.

I’m reminded of research that defines a “low-carb diet” as any carb intake of 40% of calories or below, even though other studies show that 40% is about the absolute upper end of carb intake for most hunter-gatherers. As high and low are relative concepts in defining carb intake, what counts as a meat-rich diet is relative as well. I doubt these studied carnivorous “diets rich in meat” include as high a proportion of animal foods as found in the diets of the Inuit, Masai, early Americans, and Paleolithic humans. So what is actually being compared and tested? It’s not clear. This was further confounded by how vegans, vegetarians, and pescetarians (fish-eaters) were combined into a single group mislabeled as ‘vegetarian’. Vegetarians and pescetarians technically could eat a diet that is primarily animal-based if they so chose (dairy, eggs, and/or fish), and I know plenty of vegetarians who eat more cheese than they do fruits and vegetables. Nonetheless, at least these researchers were making a better comparison than most studies. They did try to control for other confounders, such as pairing each person on a plant-based diet with “a subject of the same sex, age, and SES [socioeconomic status]” from each of the other three diets.
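
Since such thresholds are defined as a share of calories rather than grams, here is a minimal sketch of the underlying arithmetic, using the standard Atwater factors (4 kcal/g for carbohydrate and protein, 9 kcal/g for fat); the gram figures are hypothetical.

```python
# Minimal sketch of "percent of energy from carbohydrate", the measure behind
# thresholds like "a low-carb diet is 40% of calories or below".

def carb_energy_share(carbs_g: float, protein_g: float, fat_g: float) -> float:
    """Percent of total energy from carbohydrate, via Atwater factors (4/4/9 kcal per gram)."""
    carb_kcal = 4 * carbs_g
    total_kcal = 4 * carbs_g + 4 * protein_g + 9 * fat_g
    return 100 * carb_kcal / total_kcal

# A hypothetical day of 250 g carbs, 100 g protein, 80 g fat:
print(f"{carb_energy_share(250, 100, 80):.0f}% of energy from carbs")  # ~47%
```

On those assumed numbers, a day that might feel meat-heavy at the dinner table still gets nearly half its energy from carbohydrate, which is the definitional slack this paragraph is complaining about.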

What were the results? Vegetarians, compared to the most meat-based of the diets, had worse outcomes for numerous health conditions: asthma, allergies, diabetes, cataracts, tinnitus, cardiac infarction, bronchitis, sacrospinal complaints, osteoporosis, gastric or intestinal ulcer, cancer, migraine, mental illness (anxiety disorder or depression), and “other chronic conditions.” There were only a few health conditions where the plant-based dieters fared better. For example, the so-called ‘vegetarians’ had lower rates of hypertension compared to carnivores rich in meat and less rich in meat, although higher rates than those carnivores rich in fruits and vegetables (i.e., more typical omnivores).

This is interesting evidence about the diets, though. If the carnivorous diets were low enough in starchy and sugary plant foods and low enough in dairy, they would be ketogenic, which in studies is known to lower blood pressure, and so they would show a lower rate of hypertension. This indicates that none of these diets were low-carb, much less very low-carb (ketogenic). The plant-based dieters in this study also had lower rates of stroke and arthritis, these being other health benefits seen on a ketogenic diet, which further demonstrates that this study wasn’t comparing high-carb vs low-carb, as one might expect from how the diets were described in the paper. That is to say, the researchers didn’t include a category for a ketogenic carnivore diet or even a ketogenic omnivore diet, much less a ketogenic ‘vegetarian’ diet as a control. Keep in mind that keto-carnivore is one of the most common variants among those intentionally following a carnivore diet, and that plant-based keto is probably more popular right now than keto-carnivore. So, the point is that these unexpected results are examples of the complications with confounding factors.

The only other result that showed an advantage to the ‘vegetarians’ was less urinary incontinence, that is, fewer problems with bladder control. I haven’t a clue what that might mean. If we were talking about low-carb and keto, I’d suspect that increased urination on the ‘carnivorous’ diets was related to decreased water retention (i.e., bloating) and hence the water loss that happens as metabolism shifts toward fat-burning. But since we are confident that such a diet wasn’t included in the study, these results remain anomalous. Of all the things that meat gets blamed for, I’ve never heard anyone suggest that it causes most people to urinate incessantly. That is odd. Anyway, it’s not exactly a life-threatening condition, even if it were caused by carnivory. It might have something to do with higher fat combined with higher carbs, in the way that this combination also contributes to obesity, whereas high-fat/low-carb and low-fat/high-carb do not predispose one to fat gain. The ‘vegetarianism’ in this study was conflated with a low-fat diet, but all four categories apparently involved varying degrees of higher carb intake.

The basic conclusion is that ‘vegetarians’, including vegans and pescetarians, have on average poorer health across the board, with a few possible exceptions. In particular, they suffer more from chronic diseases and report higher impairment from health disorders. Also, not only these ‘vegetarians’ but also meat-eaters who ate a largely plant-based diet (“rich in fruits and vegetables”) consult doctors more often, even as ‘vegetarians’ are inconsistent about preventative healthcare such as check-ups and vaccinations. Furthermore, “subjects with a lower animal fat intake demonstrate worse health care practices,” whatever that exactly means. Generally, ‘vegetarians’ “have a lower quality of life.”

These are interesting results since the researchers were controlling for such things as wealth and poverty, and so it wasn’t an issue of access to healthcare or the quality of one’s environment or level of education. The weakness is that no data was gathered on the macronutrient ratios of the subjects’ diets, and no testing was done on micronutrient content in the food or potential deficiencies in the individuals. Based on these results, no conclusions can be made about causal direction and mechanisms, but they do agree with other research that finds similar results, including for other health conditions, such as vegans and vegetarians having higher rates of infertility. Any single one of these results, especially something like infertility, points toward serious health concerns involving deeper systemic disease and disorder within the body.

But what really stands out is the high rate of mental illness among ‘vegetarians’ (about 10%), twice as high as among average meat-eaters (about 5%), which is to say average Westerners, and that is against the background of the Western world having experienced a drastic rise in mental illness over the past couple of centuries. And the only mental illnesses considered in this study were depression and anxiety. The percentage would be much higher if it included all other psychiatric conditions and neurocognitive disorders (personality disorders, psychosis, psychopathy, Alzheimer’s, ADHD, autism, learning disabilities, etc). Think about that: the large number of people on a plant-based diet who are struggling at the most basic level of functioning, something I personally understand from decades of chronic depression on the SAD diet. Would you willingly choose a diet that carried a high probability of causing mental health struggles and suffering, neurocognitive issues and decline?

To put this study in context, listen to what Dr. Paul Saladino, trained in psychiatry and internal medicine, has to say in the following video. Jump to around the 19-minute mark, where he goes into the nutritional angle of a carnivore diet. And by carnivore he means fully carnivore; if dairy is also restricted, as in his own eating, the diet would be ketogenic as well. A keto-carnivore diet has never been studied. Hopefully, that will change soon. Until then, we have brilliant minds like Dr. Saladino’s to dig into the best evidence presently available.

 

Here are a couple of articles from the BBC. Coming from a mainstream news source, they demonstrate how this knowledge is finally getting acknowledged in conventional healthcare and public debate. That is heartening.

[Text below is from linked articles.]

Why vegan junk food may be even worse for your health
by William Clark, BBC

There’s also the concern that the health risks associated with these kinds of nutrient deficiencies might not show up immediately. It could take years to associate foggy thoughts and tiredness with low B12 levels, infertility with low iron, and osteoporosis brought on by calcium deficiency does not show up until late 40s and 50s in most people, says Rossi.

“People will think about their health now and not their future health,” she says.

How a vegan diet could affect your intelligence
by Zaria Gorvett, BBC

In fact, there are several important brain nutrients that simply do not exist in plants or fungi. Creatine, carnosine, taurine, EPA and DHA omega-3 (the third kind can be found in plants), haem iron and vitamins B12 and D3 generally only occur naturally in foods derived from animal products, though they can be synthesised in the lab or extracted from non-animal sources such as algae, bacteria or lichen, and added to supplements.

Others are found in vegan foods, but only in meagre amounts; to get the minimum amount of vitamin B6 required each day (1.3 mg) from one of the richest plant sources, potatoes, you’d have to eat about five cups’ worth (equivalent to roughly 750g or 1.6lb). Delicious, but not particularly practical. […]

There are small amounts of choline in lots of vegan staples, but among the richest sources are eggs, beef and seafood. In fact, even with a normal diet, 90% of Americans don’t consume enough. According to unpublished research by Wallace, vegetarians have the lowest intakes of any demographic. “They have extremely low levels of choline, to the point where it might be concerning,” he says.

For vegans, the picture is likely to be bleaker still, since people who eat eggs tend to have almost double the choline levels of those who don’t. And though the US authorities have set suggested intakes, they might be way off.

Meat and mental health: a systematic review of meat abstention and depression, anxiety, and related phenomena
by Urska Dobersek et al

Conclusion: Studies examining the relation between the consumption or avoidance of meat and psychological health varied substantially in methodologic rigor, validity of interpretation, and confidence in results. The majority of studies, and especially the higher quality studies, showed that those who avoided meat consumption had significantly higher rates or risk of depression, anxiety, and/or self-harm behaviors. There was mixed evidence for temporal relations, but study designs and a lack of rigor precluded inferences of causal relations. Our study does not support meat avoidance as a strategy to benefit psychological health.

The Sickness of the Sick Care System

“Today medical schools in the United States offer, on average, only about nineteen hours of nutrition education over four years of medical school. Only 29 percent of U.S. medical schools offer the recommended twenty-five hours of nutrition education. A study in the International Journal of Adolescent Medicine and Health assessed the basic nutrition and health knowledge of medical school graduates entering a pediatric residency program and found that, on average, they answered only 52 percent of eighteen questions correctly. In short, most mainstream doctors would fail nutrition.”
~Dr. Will Cole, Ketotarian (quoted here), 2018

Burnout has become an increasing problem among physicians. A recent Medscape survey found high rates of burnout among medical practitioners, including 42% of psychiatrists and mental health professionals. Depression is also extremely common in physicians, who have a suicide rate higher than that of the general population, and even higher than that of other academics. There is also a high suicide rate in psychologists, with some studies suggesting that close to 30% have felt suicidal and nearly 4% have made a suicide attempt. One study of more than 1000 randomly sampled counseling psychologists found that 62% of respondents self-identified as depressed, and of those with depressive symptoms, 42% reported experiencing some form of suicidal ideation or behavior.
~Batya Swift Yasgur, Challenging Stigma: Should Psychiatrists Disclose Their Own Mental Illness?, 2019

“Researchers Rubén Díaz and Carlos Rodríguez explored the burnout prevalence of mental health professionals in Panama (where I live and work) and found that about 36 percent of its community has suffered from burnout syndrome at one point or another of their careers… While it’s not shocking to learn that mental health professionals also struggle with mental health issues—given that we’re human and all—it’s disconcerting to see research show that mental health care professionals are hesitant to seek help. In the aforementioned study, about 43 percent of psychologists “struggle to see the presentation of mental illness and psychological distress within themselves,” and one in five psychologists withholds information about their emotional difficulties.”
~Mariana Plata, Therapists Need Therapy, Too, 2018

Probably no single fact illustrates the frequency of this disease [neurasthenia] more impressively than this, that at all times while on duty, I have a number of physicians, who are themselves sufferers in this way, under my care. Many of these medical patients have been afflicted for years, without ever reaching the true diagnosis of the condition, and in not a few instances, the real debility and distress are heightened and intensified by fear of impending disablement. Overworked and overworried physicians are quite apt to develop this disease, and for reasons elsewhere stated… are also more likely to develop at the same time hypochondria or pathophobia. At least one of every ten of those who consult me for neurasthenia are physicians.
~Dr. George Miller Beard, A Practical Treatise On Nervous Exhaustion (Neurasthenia), 1884

“Perhaps he is best known for the establishment of his rest cure, a method of treatment for patients, especially women, who suffered from hysteria and neurasthenia. The cure became the standard treatment for many decades, particularly in England… On a visit to Paris, Mitchell sought out the great Jean Martin Charcot (1825-1893) for help without revealing his name. Where was he from? “Philadelphia?” Then said Charcot: “You should consult Weir Mitchell; he is the best man in America for your kind of trouble.”
~Whonamedit? Biographical Dictionary, Silas Weir Mitchell

“Heard joke once: Man goes to doctor. Says he’s depressed. Says life seems harsh and cruel. Says he feels all alone in a threatening world where what lies ahead is vague and uncertain. Doctor says, “Treatment is simple. Great clown Pagliacci is in town tonight. Go and see him. That should pick you up.” Man bursts into tears. Says, “But doctor…I am Pagliacci.”
~Alan Moore, Watchmen, 1987

Native Americans Feasted Some But Mostly Fasted

“There are to be found, among us, a few strong men and women — the remnant of a by-gone generation, much healthier than our own — who can eat at random, as the savages do, and yet last on, as here and there a savage does, to very advanced years. But these random-shot eaters are, at most, but exceptions to the general rule, which requires regularity.”
~William Andrus Alcott, 1859

Three Squares: The Invention of the American Meal
by Abigail Carroll, pp. 12-14

Encountering the tribal peoples of North America, European explorers and settlers found themselves forced to question an institution they had long taken for granted: the meal. “[They] have no such thing as set meals breakfast, dinner or supper,” remarked explorer John Smith. Instead of eating at three distinct times every day, natives ate when their stomachs cued them, and instead of consuming carefully apportioned servings, they gleaned a little from the pot here and there. English colonists deplored this unstructured approach. They believed in eating according to rules and patterns—standards that separated them from the animal world. But when it came to structure, colonists were hardly in a position to boast. Though they believed in ordered eating, their meals were rather rough around the edges, lacking the kind of organization and form that typifies the modern meal today. Hardly well defined or clean-cut, colonial eating occasions were messy in more ways than one. Perhaps this partially explains why explorers and colonists were so quick to criticize native eating habits—in doing so, they hid the inconsistencies in their own.

Colonists found Native American eating habits wanting because they judged them by the European standard. For Europeans, a meal combined contrasting components—usually cereals, vegetables, and animal protein. Heat offered an additional desirable contrast. Swedish traveler Peter Kalm noted that many “meals” consumed by the natives of the mid-Atlantic, where he traveled in the mid-eighteenth century, consisted simply of “[maple] sugar and bread.” With only two ingredients and a distinct lack of protein, not to mention heat, this simplistic combination fell short of European criteria; it was more of a snack. Other typical nonmeals included traveling foods such as nocake (pulverized parched cornmeal to which natives added water on the go) and pemmican (a dense concoction of lean meat, fat, and sometimes dried berries). Hunters, warriors, and migrants relied on these foods, designed to be eaten in that particularly un-meal-like way in which John Williams ate his frozen meat on his journey to Québec: as the stomach required it and on the go.

Jerked venison and fat, chewed as one traversed the wilderness, was not most colonists’ idea of a proper meal, and if natives’ lack of sufficient contrasting components and the absence of a formal eating schedule puzzled colonists, even more mystifying was natives’ habit of going without meals, and often without any food at all, for extended periods. Jesuit missionary Christian LeClercq portrayed the Micmac of the Gaspé Peninsula in Canada as a slothful people, preserving and storing only a token winter’s supply: “They are convinced that fifteen to twenty lumps of meat, or of fish dried or cured in the smoke, are more than enough to support them for the space of five to six months.” LeClercq and many others did not realize that if natives went hungry, they did so not from neglect but by choice. Fasting was a subsistence strategy, and Native Americans were proud of it.

Throughout the year, Native Americans prepared for times of dearth by honing their fasting skills. They practiced hunger as a kind of athletic exercise, conditioning their bodies for the hardships of hunting, war, and seasonal shortages. According to artist George Catlin, the Mandan males in what are now the Dakotas “studiously avoided . . . every kind of excess.” An anthropologist among the Iroquois observed that they were “not great eaters” and “seldom gorged themselves.” To discourage gluttony, they even threatened their children with a visit from Sago’dakwus, a mythical monster that would humiliate them if it caught them in the act of overeating.

Native and European approaches to eating came to a head in the vice of gluttony. Many tribal peoples condemned overeating as a spiritual offense and a practice sure to weaken manly resolve and corrupt good character. Europeans also condemned it, largely for religious reasons, but more fundamentally because it represented a loss of control over the animal instincts. In the European worldview, overindulgence was precisely the opposite of civility, and the institution of the meal guarded against gluttony and a slippery descent into savagery. The meal gave order to and set boundaries around the act of eating, boundaries that Europeans felt native practices lacked. As explorers and colonists defended the tradition of the meal, the institution took on new meaning. For them, it became a subject of pride, serving as an emblem of civilization and a badge of European identity.

Europeans viewed Native Americans largely as gluttons. Because whites caught only fleeting glimpses of the complex and continually shifting lives of Native Americans, they were liable to portray the native way of life according to a single cultural snapshot, which, when it came to food, was the posthunt feast. It was well known that natives ate much and frequently during times of abundance. John Smith recorded that when natives returned from the hunt with large quantities of bear, venison, and oil, they would “make way with their provision as quick as possible.” For a short time, he explained, “they have plenty and do not spare eating.” White witnesses popularized the image of just such moments of plenty as typical.

Although Native Americans were hardly gluttons, Europeans, fascinated by the idea of a primitive people with a childlike lack of restraint, embraced the grossly inaccurate stereotype of the overeating Indian. William Wood portrayed the natives of southern New England as gorging themselves “till their bellies stand forth, ready to split with fullness.” A decidedly strange Anglo-American amusement involved watching Native Americans relish a meal. “Why,” asked George Catlin, “[is it] that hundreds of white folks will flock and crowd round a table to see an Indian eat?” With a hint of disappointment, William Wood recorded the appetites of tribes people invited to an English house to dine as “very moderate.” Wood was uncertain whether to interpret this reserve as politeness or timidity, but clearly he and his fellow English spectators had not expected shy and tempered eaters.

One culture’s perception of another often says more about the perceiver than the perceived. Although settlers lambasted natives for gluttony, whites may have been the real gluttons. According to more than one observer, many a native blushed at Europeans’ bottomless stomachs. “The large appetites of white men who visited them were often a matter of surprise to the Indians who entertained them,” wrote a nineteenth-century folklorist among the Iroquois. Early anthropologist Lewis Morgan concluded that natives required only about one-fifth of what white men consumed, and he was skeptical of his own ability to survive on such a paucity of provisions.

Through their criticisms, exaggerations, and stereotypes, colonists distanced themselves from a population whose ways appeared savage and unenlightened, and the organized meal provided a touchstone in this clash of cultures. It became a yardstick by which Europeans measured culture and a weapon by which they defended their definition of it. They had long known what a meal was, but now, by contrast, they knew firsthand what it was not. Encountering the perceived meal-less-ness of the natives brought the colonists’ esteemed tradition into question and gave them an opportunity to confirm their commitment to their conventions. They refused to approve of, let alone adapt to, the loose foodways of Native Americans and instead embraced all the more heartily a structured, meal-centered European approach to eating.

Old Debates Forgotten

Since earlier last year, I’ve done extensive reading, largely but not entirely focused on health. This has particularly concerned diet and nutrition, although it has crossed over into the territory of mental health with neurocognitive issues, addiction, autism, and much else, with my personal concern being that of depression. The point of this post is to consider some of the historical background. Before I get to that, let me explain how my recent interests have developed.

What got me heading in this direction was the documentary The Magic Pill. It’s about the paleo diet. The practical advice was worth the time spent, though other things drew me into the larger arena of low-carb debate. The thing about the paleo diet is that it offers a framework of understanding that includes many scientific fields involving health beyond diet alone, and it also explores historical records, anthropological research, and archaeological evidence. The paleo diet community in particular, along with the low-carb diet community in general, is also influenced by the traditional foods approach of Sally Fallon Morrell. She is the lady who, more than anyone else, popularized the work of Weston A. Price, an early 20th-century dentist who traveled the world and studied traditional populations. I was already familiar with this area from having read Morrell’s first book in the late ’90s or early aughts.

New to me were the writings of Gary Taubes and Nina Teicholz, two science journalists who have helped to shift the paradigm in nutrition studies. They accomplished this task by presenting not only detailed surveys of the research and other evidence but also a contextualized history of the powerful figures, institutions, and organizations that shaped the modern industrial diet. I didn’t realize how far back this debate went: writings on fasting for epilepsy are found in ancient texts; recommendations of a low-carb (apparently ketogenic) diet for diabetes appeared in the 1790s; various low-carb and animal-based diets were popularized for weight loss and general health during the 19th century; and the ketogenic diet was studied for epilepsy beginning in the 1920s. Yet few know this history.

Ancel Keys was one of those powerful figures who, in suppressing his critics and silencing debate, effectively advocated for the standard American diet of high carbs, grains, fruits, vegetables, and industrial seed oils. In The Magic Pill, more recent context is given in following the South African trial of Tim Noakes. Other documentaries have covered this kind of material, often with interviews with Gary Taubes and Nina Teicholz. There has been immense drama involved and, in the past, there was also much public disagreement and discussion. Only now is that returning to mainstream awareness in the corporate media, largely because social media has forced it out into the open. But what interests me is how old the debate is and how much livelier it often was in the past.

The post-revolutionary era created a sense of crisis that, by the mid-19th century, was becoming a moral panic. The culture wars were taking shape. The difference back then was that there was much more of a sense of the connection between physical health, mental health, moral health, and societal health. As a broad understanding, health was seen as key, and this was informed by the developing scientific consciousness and free speech movement. The hunger for knowledge was hard to suppress, although there were many attempts as the century went on. I tried to give a sense of this period in two massive posts, The Crisis of Identity and The Agricultural Mind. It’s hard to imagine what that must’ve been like. Both the scientific and the public debate were largely shut down around the era of the World Wars, as the oppressive Cold War era took over. Why?

It is strange. The work of Taubes and Teicholz gives a hint of what changed, although the original debate was much wider than diet and nutrition. The information I’ve found about the past has largely come from scholarship in other fields, such as historical and literary studies. Those older lines of thought are mostly treated as historical curiosities at this point, background for the analysis of entirely other subjects. As for the majority of scientists, doctors, and nutritionists these days, they are almost entirely ignorant of the ideologies that shaped modern thought about disease and health.

This is seen, as I point out, in how Galen’s ancient Greek theory of humors, as incorporated into Medieval Christianity, appears to be the direct source of the basic arguments for a plant-based diet, specifically in terms of the scapegoating of red meat, saturated fat, and cholesterol. Among what I’ve come across, the one scholarly book that covers this in detail is Food and Faith in Christian Culture, edited by Ken Albala and Trudy Eden. Bringing that into present times, Belinda Fettke dug up how much of contemporary nutrition research and dietary advice was built on the foundation of 19th- and 20th-century vegan advocacy by the Seventh-day Adventists. I’ve never met anyone adhering to “plant-based” ideology who knows this history. Yet now it is becoming common knowledge in the low-carb world.

On the literary end of things, there is a fascinating work by Bryan Kozlowski, The Jane Austen Diet. I enjoyed reading it, in spite of never having cracked open a book by Jane Austen. Kozlowski, although no scholar, was able to dredge up much of interest about those post-revolutionary decades in British society. For one, he shows how obesity was becoming noticeable all the way back then and many were aware of the benefits of low-carb diets. He also makes clear that the ability to maintain a vegetable garden was a sign of immense wealth, not a means of putting much food on the tables of the poor — this is corroborated by Teicholz’s discussion of how gardening in American society, prior to modern technology and chemicals, was difficult and not dependable. More importantly, Kozlowski’s book explains what ‘sensibility’ meant back then, related to ‘nerves’ and ‘vapors’ and later on given the more scientific-sounding label of ‘neurasthenia’.

I came across another literary example of historical exegesis about health and diet, Sander L. Gilman's Franz Kafka, the Jewish Patient. Kafka was an interesting case, a lifelong hypochondriac who, it turns out, had good reason to be. He felt that he had inherited a weak constitution and blamed his psychological troubles on it, but the more likely causes were urbanization, industrialization, and a vegetarian diet that probably was also a high-carb diet based on nutrient-depleted processed foods; and this was before the time when industrial foods were fortified and many nutritional supplements were available.

What was most educational about the text, though, was Gilman's historical detail on tuberculosis in European thought, specifically in relationship to Jews. To some extent, Kafka had internalized racial ideology, and that is unsurprising. Eugenics was in the air, and racial ideology penetrated everything, especially health in terms of racial hygiene. Even for those who weren't eugenicists, all debate of that era was marked by the expected biases and limitations. Some theorizing was better than others, and certainly not all of it was racist, but perhaps the entire debate was tainted by the events that would follow. With the defeat of the Nazis, eugenics fell out of favor for obvious reasons, and an entire era of debate was silenced, even many of the arguments that were opposed to or separate from eugenics. Then historical amnesia set in, as many people wanted to forget the past and instead focus on the future. That was unfortunate. The past doesn't simply disappear but continues to haunt us.

That earlier debate was a struggle between explanations and narratives. With modernity fully taking hold, people wanted to understand what was happening to humanity and where it was heading. It was a time of contrasts, which made the consequences of modernity quite stark. There were plenty of communities that were still pre-industrial, rural, and traditional, but since then most of these communities have died away. The diseases of civilization, at this point, have become increasingly normalized as living memory of anything else has disappeared. It's not that the desire for ideological explanations has disappeared. What happened was that, with the victory of WWII, a particular grand narrative came to dominate the entire Western world, and there simply were no other grand narratives to compete with it. Much of the pre-war debate and even scientific knowledge, especially in Europe, was forgotten as the records of it were destroyed, weren't translated, or lost perceived relevance.

Nonetheless, all of those old ideological conflicts were left unresolved. The concerns then are still concerns now. So many problems worried about back then are getting worse. The connections between the various aspects of health have regained their old sense of urgency. The public is once again challenging authorities, questioning received truths, and seeking new meaning. The debate never ended and here we are again; one could add that fascism is also back, rearing its ugly head. It's worrisome that the political left seems slow on the uptake. There are reactionary right-wingers like Jordan Peterson who are offering visions of meaning and who have also become significant figures in the dietary world, by way of the carnivore diet he and his daughter are on. Then there are the conspiratorial paleo-libertarians such as Tristan Haggard, another carnivore advocate.

This is far from being limited to carnivory, and the low-carb community includes people from across the political spectrum, but it seems to be the right-wingers who are speaking the loudest. The left-wingers who are speaking out on diet come from the confluence of veganism/vegetarianism and environmentalism, as seen with EAT-Lancet (Dietary Dictocrats of EAT-Lancet). The problem with this, besides much of the narrative being false (Carnivore is Vegan), is that it is disconnected from the past. The right-wing is speaking more to the past than is the left-wing, as seen in Trump's ability to invoke and combine the Populist and Progressive rhetoric of earlier last century. The political left is struggling to keep up and is being led down ideological dead-ends.

If we want to understand our situation now, we had better study carefully what was happening in centuries past. We are having the same old debates without realizing it, and we very well might see them lead to the same kinds of unhappy results.

Moral Panic and Physical Degeneration

From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving issues of the vitality of the land and the health of the body, and built on an ancient divide between the urban and the rural. Over time, it grew into a fever pitch of moral panic about the degeneration and degradation of WASP culture, the white race, and maybe civilization itself. Some believed the end was near, that they might hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), which sadly fed into ethno-nationalist bigotry and imperialistic war-mongering: Make America Great Again!

A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but maybe more so because of mass immigration, the latter being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. However easy it is now to dismiss this kind of simplistic ignorance, and with hindsight to see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health, observed by many, and it wasn't limited to mere physical ailments. "Cancer, like insanity, seems to increase with the progress of civilization," noted Stanislas Tanchou, a mid-19th century French physician.

The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering that the 'modern' chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Streets). That era included the enclosure movement that forced millions of newly landless serfs into the desperate conditions of crowded cities and colonies where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century, and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It still is happening. The younger generation, more urbanized than any generation before, is seeing rising rates of psychosis that are specifically concentrated in the most urbanized areas.

In the United States, the last decades of the 19th century were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he was increasingly exposed to younger generations who had spent their entire lives in the city. Lierre Keith puts Price's early observations in context, after pointing out that he started his career in 1893: "This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children's dentition — and indeed their overall health — deteriorate. There were suddenly children whose teeth didn't fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall: asthma, allergies, behavioral problems" (The Vegetarian Myth, p. 187). This was at the time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied on a heavy amount of wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.

We are once again coming back to the ever-recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated again. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing early (1% of female infants showing signs of puberty), whereas boys are maturing later. As a comparison, hunter-gatherers don't show such a large gender disparity in puberty, nor do girls experience puberty so early; instead, both genders typically reach puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Common causes held responsible include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc., all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but now is regularly diagnosed in young children; the youngest victim recently recorded was three years old when diagnosed).

In the past, Americans responded to moral panic with genocide of Native Americans; Prohibition targeting ethnic (hyphenated) Americans and the poor; immigration restrictions to keep the bad sort out; the spread of racism and vigilantism, such as the KKK, Jim Crow, sundown towns, and redlining; forced assimilation, such as English-only laws and public schools; internment camps for not only Japanese-Americans but also German-Americans and Italian-Americans; implementation of citizen-making projects like the national park system, Boy Scouts, WPA, and CCC; promotion of eugenics, the war on poverty (i.e., the war on the poor), imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.

Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).

Although Price was familiar with the eugenics literature, what he observed in 'primitive' communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn't matter which population he looked at. Those who ate traditional diets were healthy and those who ate an industrialized Western diet were not. And it was a broad pattern that he saw everywhere he went, not only in physical health but also in neurocognitive health, as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition, and he made a strong case by scientifically analyzing the nutrition of the available foods.

In reading about traditional foods, the paleo diet/lifestyle, and functional medicine, Price's work comes up quite often. He took many photographs comparing people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere close to as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town where there is a far-above-average concern for health, along with access to healthcare. Even so, I now can't help noticing how many people around me show signs of stunted or perturbed development of exactly the kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon-toes, etc. My fellow residents in this town are among the most privileged people in the world and, nonetheless, their state of health is a sad commentary on humanity at present.

It makes me wonder, as it made Price wonder, what consequences this has for the neurocognitive health of individuals and the moral health of society. Taken alone, it isn't enough to get excited about. But put it in the larger context of looming catastrophes and it does become concerning. It's not clear that our health will be up to the task of solving the problems we need to solve. We are a sickly population, far more sickly than when moral panic took hold in past generations.

As important, there is the personal component. I’m at a point where I’m not going to worry too much about decline and maybe collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as being key. I see these problems in other members of my family and it saddens me to watch as health conditions seem to get worse from one generation to the next.

It’s far from being a new problem, the central point I’m trying to make here. Talking to my mother, she has a clear sense of the differences on the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis as there were nearby fields and woods that made that possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.

What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. Maybe not a learning disability, but behavioral issues were apparent when my oldest brother was young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. All of us have had neurocognitive and psychological issues of a fair diversity, besides learning disabilities: stuttering, depression, anxiety, and maybe some Asperger’s.

Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal’ in that, as she claimed, no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.

* * *

Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic

Nutrition and Mental Development
by Sally Fallon Morell

You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell

While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.

Nasty, Brutish and Short?
by Sally Fallon Morell

It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930's, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price's day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called "facial deformities": overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990's: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such "physical degeneration" was God's plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.

Is it Mental or is it Dental?
by Raymond Silkman

The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn't believe that the Creator had given them a terrible blueprint; it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.

These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe, photographs through which he was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?

The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]

People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.

In other words, people with poor facial development are not going to live very happily. […]

While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.

Weston A. Price: An Unorthodox Dentist
by Nourishing Israel

Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet there was noticeable tooth decay; in subsequent generations the dental and facial bone structure changed, along with other changes that had been seen in American and European families and previously considered to be the result of interracial marriage.

By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasm they provide, which is as important to the child's makeup as the health of the mother before and during pregnancy.

Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.

Another interesting finding was that although genetic makeup was important, it did not have as great a degree of influence on a person's development and health as was thought; rather, a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called "intercepted heredity".

The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …

It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]

From Nutrition and Physical Degeneration by Weston Price

Food Freedom – Nourishing Raw Milk
by Lisa Virtue

In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese and butter, with meat once a week (Price, 25). The milk was collected from pastured cows and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).

Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had been recorded from tuberculosis in the history of the Loetschental people (Schmid, 8). Upon return home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat-soluble vitamin D (Schmid, 9).

The daily intake of calcium, phosphorus, and fat-soluble vitamins would have been higher than that of the average North American child. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6 and 19 in the United States has been recorded at 3.25, over ten times the rate seen in Loetschental (Nagel).

Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).

100 Years Before Weston Price
by Nancy Henderson

Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” He wrote of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains: “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”

“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]

As did Weston A. Price one hundred years later, Catlin noted the fact that moral and physical degeneration came together with the advent of civilized society. In his late 1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington,” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]

Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”

So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”

The Right Price
by Weston A. Price Foundation

Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century blamed “badness” in people on race mixing, or on genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),35 who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion to this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit, something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.

It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”36 Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences–poverty, upbringing, displacement, etc.–good nutrition also plays a role in creating a society of cheerful, compassionate individuals.36

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9

By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.

The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.

By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”

Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.

On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.

Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.

Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.

The most popular forms of regeneration had a moral dimension.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 63-71

NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:

[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.

The assumptions behind this language are fascinating and important to an understanding of middle- and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).

But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.

The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]

It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.

By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”

While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]

Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.

The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.

AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.

pp. 99-110

Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.

The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.

Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]

By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.

For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]

Science and religion seemed to point in the same direction: Progress and Providence were one.

Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?

Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]

The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.

pp. 135-138

Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]

REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.

There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.

Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.

pp. 246-247

Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.

And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.

As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth, and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end in itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.

The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.

The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.

pp. 282-283

TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).

Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.

What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.

Dr. Catherine Shanahan On Dietary Epigenetics and Mutations

Dr. Catherine Shanahan is a board-certified family physician with an undergraduate degree in biology, along with training in biochemistry and genetics. She has also studied ethno-botany, culinary traditions, and ancestral health. Besides regularly appearing in and writing for national media, she has worked as director and nutrition consultant for the Los Angeles Lakers. On High Intensity Health, she was interviewed by nutritionist Mike Mutzel (Fat Adapted Athletes Perform Better). At the 31:55 mark in that video, she discussed diet (in particular, industrial vegetable oils or simply seed oils), epigenetic inheritance, de novo genetic mutations, and autism. This can be found in the show notes (#172) where it is stated that,

“In 1909 we consumed 1/3 of an ounce of soy oil per year. Now we consume about 22 pounds per year. In the amounts that we consume seed oils, it breaks down into some of the worst toxins ever discovered. They are also capable of damaging our DNA. Many diseases are due to mutations that children have that their parents did not have. This means that mothers and fathers with poor diets have eggs/sperm that have mutated DNA. Children with autism have 10 times the number of usual mutations in their genes. Getting off of seed oils is one of the most impactful things prospective parents can do. The sperm has more mutations than the egg.”
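To get a feel for the scale of that shift, it helps to put both figures in the same units. Here is a minimal sketch of the arithmetic, assuming only the quote’s own figures (1/3 ounce per year in 1909, about 22 pounds per year now) and 16 ounces to the pound:

```python
# Scale of the change in soy oil consumption, using only the figures
# quoted above: 1/3 oz per person per year in 1909 vs. ~22 lb now.

OZ_PER_LB = 16

oil_1909_oz = 1 / 3           # one-third of an ounce per year
oil_now_oz = 22 * OZ_PER_LB   # 22 pounds = 352 ounces per year

fold_increase = oil_now_oz / oil_1909_oz
print(f"~{fold_increase:,.0f}-fold increase")  # ~1,056-fold
```

In other words, taking the show notes at face value, per-person soy oil consumption has increased roughly a thousandfold in about a century.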

These seed oils didn’t exist in the human diet until the industrial era. Our bodies are designed to use and incorporate the PUFAs from natural sources, but the processing into oils through high pressure and chemicals denatures the structure of the oil and destroys the antioxidants. The oxidative stress that follows from adding them to the diet comes precisely because these altered oils act as Trojan horses, treated by the body like natural fats. This is magnified by a general increase of PUFAs, specifically omega-6 fatty acids, with a simultaneous decrease of omega-3 fatty acids and saturated fats. The difference isn’t overall fat intake: the roughly 40% of calories we now get from fat is about the same as at the beginning of the last century. What is different is these oxidized PUFAs combined with massive loads of sugar and starches like never seen before.

Dr. Shanahan sees these industrial plant oils as the single greatest harm, such that she doesn’t consider them to be a food but a toxin, originally discovered as an industrial byproduct. She is less worried about any given category of food or macronutrient, as long as you first and foremost remove this specific source of toxins.** She goes into greater detail in a talk from Ancestry Foundation (AHS16 – Cate Shanahan – Bad Diet, Bad DNA?). And her book, Deep Nutrition, is a great resource on this topic. I’ll leave that for you to further explore, if you so desire. Let me quickly and simply note an implication of this.

Genetic mutations demonstrate how serious this situation is. The harm we are causing ourselves might go beyond punishment for our personal sins: the sins of the father and mother may genetically pass on to their children, grandchildren, and further on (one generation of starvation or smoking among grandparents leads to generations of smaller birth weight and underdevelopment among the grandchildren and maybe beyond, even if the intervening generation of parents was healthy).

It might not be limited to a temporary transgenerational harm, as seen with epigenetics. This could be permanent harm to our entire civilization, fundamentally altering our collective gene pool. We could recover from epigenetic damage within a few generations, assuming we took the problem seriously and acted immediately (Dietary Health Across Generations), but with genetic mutations we may never be able to undo the damage. These mutations have been accumulating and will continue to accumulate until we return to an ancestral diet of healthy foods as part of an overall healthy lifestyle and environment. Even mutations, though, can be moderated by epigenetics, as the body is designed to deal with them.

This further undermines genetic determinism and biological essentialism. We aren’t mere victims doomed to a fate beyond our control. This dire situation is being created by all of us, individually and collectively. There is no better place to begin than with your own health, but we had better also treat this as a societal crisis verging on catastrophe. It was public policies and an international food system that created the conditions for this failed mass experiment of dietary dogma and capitalist realist profiteering. Maybe we could try something different, something less psychopathically authoritarian, less psychotically disconnected from reality, less collectively suicidal. Heck, it’s worth a try.

* * *

** I’d slightly disagree with her emphasis. She thinks what matters most is the changes over the past century. There is a good point made in this focus on late modernity. But I’d note that industrialization and modern agriculture began in the prior centuries.

It was in the colonial era that pasta was introduced to Italy, potatoes to Ireland, and sugar throughout the Western world. It wasn’t until the late 1700s, and more clearly the early 1800s, that regular grain surpluses made grains available for feeding and fattening both humans and cattle. In particular, it was around this time that agricultural methods improved for wheat crops, making wheat affordable to the general public for the first time in human existence and causing white bread to become common during the ensuing generations.

I don’t know about diseases like Alzheimer’s, Parkinson’s, and multiple sclerosis. But I do know that the most major diseases of civilization (obesity, diabetes, cancer, and mental illness) were first noticed to be on the rise during the 1700s and 1800s or sometimes earlier, long before industrial oils or the industrial revolution that made these oils possible. The high-carb diet appeared gradually with colonial trade and spread across numerous societies, first hitting the wealthiest before eventually being made possible for the dirty masses. During this time, it was observed by doctors, scientists, missionaries and explorers that obesity, diabetes, cancer, mental illness and moral decline quickly followed on the heels of this modern diet.

Seed oils were simply the final Jenga block pulled from an ever-growing and ever more wobbly tower, replacing the healthy nutrient-dense animal fats (full of fat-soluble vitamins, choline, omega-3 fatty acids, etc.) that had counterbalanced some of the worst effects of the high-carb diet. But seed oils, like farm chemicals such as glyphosate, would never have had as severe and dramatic an impact if not for the previous centuries of worsening diet and health. The damage had been building up over a long time, and the tower was doomed to topple right from the start. We are simply now at the tipping point, the inevitable conclusion of a sad trajectory.

Still, it’s never too late… or let us hope. Dr. Shanahan prefers to end on an optimistic note. And I’d rather not disagree with her about that. I’ll assume she is right or that she is at least in the general ballpark. Let us do as she suggests. We need more and better research, but somehow industrial seed oils have slipped past the notice of autism researchers.

* * *

On Deep Nutrition and Genetic Expression
interview by Kristen Michaelis CNC

Dr. Cate: Genetic Wealth is the idea that if your parents or grandparents ate traditional and nutrient-rich foods, then you came into the world with genes that could express in an optimal way, and this makes you more likely to look like a supermodel and be an extraordinary athlete. Take Angelina Jolie or Michael Jordan, for instance. They’ve got loads of genetic wealth.

Genetic Momentum describes the fact that, once you have that extraordinary genetic wealth, you don’t have to eat so great to be healthier than the average person. It’s like being born into a kind of royalty. You always have that inheritance around and you don’t need to work at your health in the same way other people do.

These days, for most of us, it was our grandparents or great grandparents who were the last in our line to grow up on a farm or get a nutrient-rich diet. In my case, I have to go back 4 generations to the Irish and Russian farmers who immigrated to NYC where my grandparents on both sides could only eat cheap food; sometimes good things like chopped liver and beef tongue, but often preserves and crackers and other junk. So my grandparents were far healthier than my brother and sisters and I.

The Standard American Diet (SAD) has accelerated the processes of genetic wealth being spent down, genetic momentum petering out, and the current generation getting sick earlier than their parents and grandparents. This is a real, extreme tragedy on the order of end-of-the-world level losses of natural resources. Genetic wealth is a kind of natural resource. And loss of genetic wealth is a more urgent problem than peak oil or the bursting of the housing bubble. But of course nobody is talking about it directly, only indirectly, in terms of increased rates of chronic disease.

Take autism, for example. Why is autism so common? I don’t think vaccines are the reason for the vast vast majority of cases, since subtle signs of autism can be seen before vaccination in the majority. I think the reason has to do with loss of genetic wealth. We know that children with autism exhibit DNA mutations that their parents and grandparents did not have. Why? Because in the absence of necessary nutrients, DNA cannot even duplicate itself properly and permanent mutations develop.

(Here’s an article on one kind of genetic mutation (DNA deletions) associated with autism.)

Fortunately, most disease is not due to permanent letter mutations and therefore a good diet can rehabilitate a lot of genetic disease that is only a result of altered genetic expression. To put your high-school biology to work, it’s the idea of genotype versus phenotype. You might have the genes that make you prone to, for example, breast cancer (the BRCA1 mutation), but you might not get the disease if you eat right because the gene expression can revert back to normal.

Deep Nutrition: Why Your Genes Need Traditional Food
by Dr. Catherine Shanahan
pp. 55-57

Guided Evolution?

In 2007, a consortium of geneticists investigating autism boldly announced that the disease was not genetic in the typical sense of the word, meaning that you inherit a gene for autism from one or both of your parents. New gene sequencing technologies had revealed that many children with autism had new gene mutations, never before expressed in their family line.

An article published in the prestigious journal Proceedings of the National Academy of Sciences states, “The majority of autisms are a result of de novo mutations, occurring first in the parental germ line.” 42 The reasons behind this will be discussed in Chapter 9.

In 2012, a group investigating these new, spontaneous mutations discovered evidence that randomness was not the sole driving force behind them. Their study, published in the journal Cell, revealed an unexpected pattern of mutations occurring 100 times more often in specific “hotspots,” regions of the human genome where the DNA strand is tightly coiled around organizing proteins called histones that function much like spools in a sewing kit, which organize different colors and types of threads. 43

The consequences of these mutations seem specifically designed to toggle up or down specific character traits. Jonathan Sebat, lead author on the 2012 article, suggests that the hotspots are engineered to “mutate in ways that will influence human traits” by toggling up or down the development of specific behaviors. For example, when a certain gene located at a hotspot on chromosome 7 is duplicated, children develop autism, a developmental delay characterized by near total lack of interest in social interaction. When the same chromosome is deleted, children develop Williams Syndrome, a developmental delay characterized by an exuberant gregariousness, where children talk a lot, and talk with pretty much anyone. The phenomenon wherein specific traits are toggled up and down by variations in gene expression has recently been recognized as a result of the built-in architecture of DNA and dubbed “active adaptive evolution.” 44

As further evidence of an underlying logic driving the development of these new autism-related mutations, it appears that epigenetic factors activate the hotspot, particularly a kind of epigenetic tagging called methylation. 45 In the absence of adequate B vitamins, specific areas of the gene lose these methylation tags, exposing sections of DNA to the factors that generate new mutations. In other words, factors missing from a parent’s diet trigger the genome to respond in ways that will hopefully enable the offspring to cope with the new nutritional environment. It doesn’t always work out, of course, but that seems to be the intent.

You could almost see it as the attempt to adjust character traits in a way that will engineer different kinds of creative minds, so that hopefully one will give us a new capacity to adapt.

pp. 221-228

What Is Autism?

The very first diagnostic manual for psychiatric disorders, published in 1952, described autism simply as “schizophrenic reaction, childhood type.” 391 A later manual, released in 1980, listed more specific criteria, including “pervasive lack of responsiveness to other people” and “if speech is present, peculiar speech patterns such as immediate and delayed echolalia, metaphorical language, pronominal reversal (using you when meaning me, for instance).” 392 Of course, the terse language of a diagnostic manual can never convey the real experience of living with a child on the spectrum, or living on the spectrum yourself.

When I graduated from medical school, autism was so rarely diagnosed that none of my psychiatry exams even covered it; my classmates and I were made aware of autism more from watching the movie Rain Man than from studying course material. The question of whether autism (now commonly referred to as ASD) is more common now than it was then, or whether we are simply recognizing it more often, is still controversial. Some literature suggests that it is a diagnostic issue, and that language disorders are being diagnosed less often as autism is being diagnosed more. However, according to new CDC statistics, it appears that autism rates rose 30 percent between 2008 and 2012. Considering that diagnostic criteria had been stable for over a decade by that point, increased diagnosis is unlikely to be a major factor in this 30 percent figure. 393

Given these chilling statistics, it’s little wonder that so many research dollars have been dedicated to exploring possible connections between exposure to various environmental factors and development of the disorder. Investigators have received grants to look into a possible link between autism and vaccines, 394 smoking, 395 maternal drug use (prescription and illicit), 396, 397, 398 organophosphates, 399 and other pesticides, 400 BPA, 401 lead, 402 mercury, 403 cell phones, 404 IVF and infertility treatments, 405 induced labor, 406 high-powered electric wires, 407 flame retardants, 408 ultrasound 409 —and just about any other environmental factor you can name. You might be wondering if they’ve also looked into diet. But of course: alcohol, 410 cow’s milk, 411 milk protein, 412 soy formula, 413 gluten, 414 and food colorings 415 have all been investigated. Guess what they’ve never dedicated a single study to investigating? Here’s a hint: it’s known to be pro-oxidative and pro-inflammatory and contains 4-HNE, 4-HHE, and MDA, along with a number of other equally potent mutagens. 416 Still haven’t guessed? Okay, one last hint: it’s so ubiquitous in our food supply that for many Americans it makes up as much as 60 percent of their daily caloric intake, 417 a consumption rate that has increased in parallel with rising rates of autism.

Of course, I’m talking about vegetable oil. In Chapter 2, I discussed in some detail how and why gene transcription, maintenance, and expression are necessarily imperiled in the context of a pro-inflammatory, pro-oxidative environment, so I won’t go further into that here. But I do want to better acquaint you with the three PUFA-derived mutagens I just named because when they make it to the part of your cell that houses DNA, they can bind to DNA and create new, “de novo,” mutations. DNA mutations affecting a woman’s ovaries, a man’s sperm, or a fertilized embryo can have a devastating impact on subsequent generations.

First, let’s revisit 4-HNE (4-hydroxynonenal), which you may recall meeting in the above section on firebombing the highways. This is perhaps the most notorious of all the toxic fats derived from oxidation of omega-6 fatty acids, whose diversity of toxic effects requires that entire chemistry journals be devoted to 4-HNE alone. When the mutagenicity (ability to mutate DNA) of 4-HNE was first described in 1985, the cytotoxicity (ability to kill cells) had already been established for decades. The authors of a 2009 review article explain that it had taken so long to recognize HNE as such an effective carcinogen largely because “the cytotoxicity [cell-killing ability] of 4-HNE masked its genotoxicity [DNA-mutating effect].” 419 In other words, it kills cells so readily that they don’t have a chance to divide and mutate. How potently does 4-HNE damage human DNA? After interacting with DNA, 4-HNE forms a compound called an HNE-adduct, and that adduct prevents DNA from copying itself accurately. Every time 4-HNE binds to a guanosine (the G of the four-letter ACGT DNA alphabet), there is somewhere between a 0.5 and 5 percent chance that G will not be copied correctly, and that the enzyme trying to make a perfect copy of DNA will accidentally turn G into T. 420 Without 4-HNE, the chance of error is about a millionth of a percent. 421 In other words, 4-HNE increases the chance of a DNA copying error roughly a million times!
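Taken at face value, the excerpt’s own numbers let you check that “roughly a million times” claim directly. A minimal sketch, assuming only the quoted figures (a 0.5 to 5 percent miscopy chance per HNE-adducted G, against a baseline of about a millionth of a percent):

```python
# Sanity check of the quoted fold increase in miscopy risk for a
# G carrying an HNE-adduct vs. an unmodified G, per the excerpt.

baseline = 1e-6 / 100                  # "a millionth of a percent", as a probability
adduct_low, adduct_high = 0.005, 0.05  # 0.5% to 5% per adducted G

print(f"{adduct_low / baseline:,.0f}x to {adduct_high / baseline:,.0f}x")
# -> 500,000x to 5,000,000x: "roughly a million times" is the right order
```

So the quoted range works out to between half a million and five million times the baseline error rate, which is consistent with the book’s round figure.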

Second, 4-HHE (4-hydroxyhexenal), which is very much like 4-HNE, its more notorious bigger brother, except that 4-HHE is derived from omega-3 rather than omega-6. If bad guys had sidekicks, 4-HNE’s would be 4-HHE: it does many of the same things to DNA as 4-HNE but was discovered only recently. 422 You see, when omega-6 reacts with oxygen, it breaks apart into two major end products, whereas omega-3, being more explosive, flies apart into four different molecules. This means each one is present in smaller amounts, and that makes them a little more difficult to study. But it doesn’t make 4-HHE any less dangerous. 4-HHE specializes in burning through your glutathione peroxidase antioxidant defense system. 423 This selenium-based antioxidant enzyme is one of the three major enzymatic antioxidant defense systems, and it may be the most important player defending your DNA against oxidative stress. 424, 425

Finally, there is malonaldehyde (MDA), proven to be a mutagen in 1984 but long presumed to come only from consumption of cooked and cured meats. 426 Only in the past few decades have we had the technology to determine that MDA can be generated in our bodies as well. 427 And unlike the previous two chemicals, MDA is generated by oxidation of both omega-3 and omega-6. It may be the most common endogenously derived oxidation product. Dr. L. J. Marnett, who directs a cancer research lab at Vanderbilt University School of Medicine, Nashville, Tennessee, and who has published over 400 articles on the subject of DNA mutation, summarized his final article on MDA with the definitive statement that MDA “appears to be a major source of endogenous DNA damage [endogenous, here, meaning due to internal, metabolic factors rather than, say, radiation] in humans that may contribute significantly to cancer and other genetic diseases.” 428

There’s one more thing I need to add about vegetable-oil-derived toxic breakdown products, particularly given the long list of toxins now being investigated as potential causes of autism spectrum disorders. Not only do they directly mutate DNA, they also make DNA more susceptible to mutations induced by other environmental pollutants. 429, 430 This means that if you start reading labels and taking vegetable oil out of your diet, your body will more readily deal with the thousands of contaminating toxins that appear on no label and are nearly impossible to avoid.

Why all this focus on genes when we’re talking about autism? Nearly every day a new study comes out that further consolidates the consensus among scientists that autism is commonly a genetic disorder. The latest research is focusing on de novo mutations, meaning mutations neither parent had themselves but that arose spontaneously in their egg, sperm, or during fertilization. These mutations may affect single genes, or they may manifest as copy number variations, in which entire stretches of DNA containing multiple genes are deleted or duplicated. Geneticists have already identified a staggering number of genes that appear to be associated with autism. In one report summarizing results of examining 900 children, scientists identified 1,000 potential genes: “exome sequencing of over 900 individuals provided an estimate of nearly 1,000 contributing genes.” 431

All of these 1,000 genes are involved with proper development of the part of the brain most identified with the human intellect: our cortical gray matter. This is the stuff that enables us to master human skills: spoken language, reading, writing, dancing, playing music, and, most important, the social interaction that drives the desire to do all of the above. Only a few of these 1,000 brain-building genes, or in some cases just one, need be miscopied for altered brain development to place a child on the ASD spectrum.

So just a few troublemaker genes can obstruct the entire brain development program. But for things to go right, all the genes for brain development need to be fully functional.

Given that humans are thought to have only around 20,000 genes, and already 1,000 are known to be essential for building brain, that means geneticists have already labeled 5 percent of the totality of our genetic database as crucial to the development of a healthy brain—and we’ve just started looking. At what point does it become a foolish enterprise to continue to look for genes that, when mutated, are associated with autism? When we’ve identified 5,000? Or 10,000? The entire human genome? At what point do we stop focusing myopically only on those genes thought to play a role in autism?

I’ll tell you when: when you learn that the average autistic child’s genome carries de novo mutations not just in genes thought to be associated with autism, but across the board, throughout the entirety of the chromosomal landscape. Because once you’ve learned this, you can’t help but consider that autism might be better characterized as a symptom of a larger disease—a disease that results in an overall increase in de novo mutations.

Almost buried by the avalanche of journal articles on genes associated with autism is the finding that autistic children exhibit roughly ten times the number of de novo mutations compared to their typically developing siblings. 432 An international working group on autism pronounced this startling finding in a 2013 article entitled: “Global Increases in Both Common and Rare Copy Number Load Associated With Autism.” 433 ( Copy number load refers to mutations wherein large segments of genes are duplicated too often.) What the article says is that yes, children with autism have a larger number of de novo mutations, but the majority of their new mutations are not statistically associated with autism because other kids have them, too. The typically developing kids just don’t have nearly as many.

These new mutations are not only affecting genes associated with brain development. They are affecting all genes, seemingly universally. What is more, there is a dose-response relationship between the total number of de novo mutations and the severity of autism, such that the more gene mutations a child has (the bigger the dose of mutation), the worse their autism (the larger the response). And it doesn’t matter where the mutations are located—even in genes that have no obvious connection to the brain. 434 This finding suggests that autism does not originate in the brain, as has been assumed. The real problem—at least for many children—may actually be coming from the genes. If this is so, then when we look at a child with autism, what we’re seeing is a child manifesting a global genetic breakdown. Among the many possible outcomes of this genetic breakdown, autism may simply be the most conspicuous, as the cognitive and social hallmarks of autism are easy to recognize.

As the authors of the 2013 article state, “Given the large genetic target of neurodevelopmental disorders, estimated in the hundreds or even thousands of genomic loci, it stands to reason that anything that increases genomic instability could contribute to the genesis of these disorders.” 435 Genomic instability —now they’re on to something. Because framing the problem this way helps us to ask the more fundamental question, What is behind the “genomic instability” that’s causing all these new gene mutations?

In the section titled “What Makes DNA Forget” in Chapter 2, I touched upon the idea that an optimal nutritional environment is required to ensure the accurate transcription of genetic material and communication of epigenetic bookmarking, and how a pro-oxidative, pro-inflammatory diet can sabotage this delicate operation in ways that can lead to mutation and alter normal growth. There I focused on mistakes made in epigenetic programming, what you could call de novo epigenetic abnormalities. The same prerequisites that support proper epigenetic data communication, I submit, apply equally to the proper transcription of genetic data.

What’s the opposite of a supportive nutritional environment? A steady intake of pro-inflammatory, pro-oxidative vegetable oil that brings with it the known mutagenic compounds of the kind I’ve just described. Furthermore, if exposure to these vegetable oil-derived mutagens causes a breakdown in the systems for accurately duplicating genes, then you might expect to find other detrimental effects from this generalized defect of gene replication. Indeed we do. Researchers in Finland have found that children anywhere on the ASD spectrum have between 1.5 and 2.7 times the risk of being born with a serious birth defect, most commonly a life-threatening heart defect or neural tube (brain and spinal cord) defect that impairs the child’s ability to walk. 436 Another group, in Nova Scotia, identified a similarly increased rate of minor malformations, such as abnormally rotated ears, small feet, or closely spaced eyes. 437

What I’ve laid out here is the argument that the increasing prevalence of autism is best understood as a symptom of De Novo Gene Mutation Syndrome brought on by oxidative damage, and that vegetable oil is the number-one culprit in creating these new mutations. These claims emerge from a point-by-point deduction based on the best available chemical, genetic, and physiologic science. To test the validity of this hypothesis, we need more research.

Does De Novo Gene Mutation Syndrome Affect Just the Brain?

Nothing would redirect the trajectory of autism research in a more productive fashion than reframing autism as a symptom of the larger underlying disease, which we are provisionally calling de novo gene-mutation syndrome, or DiNGS. (Here’s a mnemonic: vegetable oil toxins “ding” your DNA, like hailstones pockmarking your car.)

If you accept my thesis that the expanding epidemic of autism is a symptom of an epidemic of new gene mutations, then you may wonder why the only identified syndrome of DiNGS is autism. Why don’t we see all manner of new diseases associated with gene mutations affecting organs other than the brain? We do. According to the most recent CDC report on birth defect incidence in the United States, twenty-nine of the thirty-eight organ malformations tracked have increased. 438

However, these are rare events, occurring far less frequently than autism. The reason for the difference derives from the fact that the brain of a developing baby can be damaged to a greater degree than other organs can, while still allowing the pregnancy to carry to term. Though the complex nature of the brain makes it the most vulnerable in terms of being affected by mutation, this aberration of development does not make the child more vulnerable in terms of survival in utero. The fact that autism affects the most evolutionarily novel portion of the brain means that as far as viability of an embryo is concerned, it’s almost irrelevant. If the kinds of severely damaging mutations leading to autism were to occur in organs such as the heart, lungs, or kidneys, fetal survival would be imperiled, leading to spontaneous miscarriage. Since these organs begin developing as early as four to six weeks of in-utero life, failure of a pregnancy this early might occur without any symptoms other than bleeding, which might be mistaken for a heavy or late period, and before a mother has even realized she’s conceived.

* * *

Rhonda Patrick’s view is similar to that of Shanahan: