Carbohydrates, Essential Nutrients, and Official Dietary Guidelines

“You’ll be reassured to know that you don’t have to eat carbohydrates to live. It’s not an essential nutrient.
“It’s one of the first things we learn in nutrition is what does the body not make and what you HAVE to eat.
“You won’t find carbohydrate on this list.”
~Eric Westman, There’s no such thing as an essential carbohydrate

“Carbohydrates are not essential nutrients.”
~Denise R. Ferrier, Biochemistry

“Carbohydrates are not essential nutrients.”
~Simon W. Walker, Peter Rae, Peter Ashby, & Geoffrey Beckett, Clinical Biochemistry

“Carbohydrates are not considered essential.”
~Carie Ann Braun & Cindy Miller Anderson, Pathophysiology: Functional Alterations in Human Health

“No specific carbohydrates have been identified as dietary requirements.”
~Michael Lieberman, Allan D. Marks, & Alisa Peet, Marks’ Basic Medical Biochemistry: A Clinical Approach

“In the absence of dietary carbohydrate, the body is able to synthesize glucose from lactic acid, certain amino acids and glycerol via gluconeogenesis.”
~Jim Mann & A. Stewart Truswell, Essentials of Human Nutrition

“Even when a person is completely fasting (religious reasons, medically supervised, etc.) the 130 g / day of glucose needed by the brain is made from endogenous protein and fat.
“When people are “fasting” the 12 hour period from the end of supper the night before until breakfast (“break the fast”) the next day, their brain is supplied with essential glucose! Otherwise, sleeping could be dangerous.”
~Joy Kiddie, How Much Carbohydrate is Essential in the Diet?

Dietary Reference Intakes for Energy, Carbohydrate, Fiber, Fat, Fatty Acids, Cholesterol, Protein, and Amino Acids
from National Academies of Sciences, Engineering, and Medicine
published by the Institute of Medicine
2005 report of the US Food and Nutrition Board

The lower limit of dietary carbohydrate compatible with life apparently is zero, provided that adequate amounts of protein and fat are consumed. However, the amount of dietary carbohydrate that provides for optimal health in humans is unknown. There are traditional populations that ingested a high fat, high protein diet containing only a minimal amount of carbohydrate for extended periods of time (Masai), and in some cases for a lifetime after infancy (Alaska and Greenland Natives, Inuits, and Pampas indigenous people) (Du Bois, 1928; Heinbecker, 1928). There was no apparent effect on health or longevity. Caucasians eating an essentially carbohydrate-free diet, resembling that of Greenland natives for a year tolerated the diet quite well. However, a detailed modern comparison with populations ingesting the majority of food energy as carbohydrate has never been done.

Why Won’t We Tell Diabetics the Truth?
by Diana Rodgers

They base the carbohydrate requirement of 87–112 grams per day on the amount of glucose needed to avoid ketosis. They arrived at the number 100 g/day as “the amount sufficient to fuel the central nervous system without having to rely on a partial replacement of glucose by ketoacid,” and then they later say that “it should be recognized that the brain can still receive enough glucose from the metabolism of the glycerol component of fat and from the gluconeogenic amino acids in protein when a very low carbohydrate diet is consumed.” (Meaning, ketosis is NO BIG DEAL. In fact, it’s actually a good thing and is not the same as the diabetic ketoacidosis that type 1 diabetics and insulin-dependent type 2 diabetics can get.) The RDA of 130 g/day was computed by using a CV of 15% based on the variation in brain glucose utilization and doubling it; therefore the RDA (recommended daily allowance) for carbohydrate is 130% of the EAR (estimated average requirement).
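To make that arithmetic explicit, here is a minimal sketch of the calculation described above, assuming the figures cited (an EAR of 100 g/day for brain glucose and a CV of 15%):

```python
# A minimal sketch of the DRI arithmetic described above: the RDA is set at the
# EAR plus two coefficients of variation, i.e. RDA = EAR * (1 + 2 * CV).
ear_g_per_day = 100   # estimated average requirement cited for brain glucose
cv = 0.15             # coefficient of variation in brain glucose utilization

rda_g_per_day = ear_g_per_day * (1 + 2 * cv)
print(rda_g_per_day)  # 130.0 g/day, i.e. 130% of the EAR
```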

Added sugars drive nutrient and energy deficit in obesity: a new paradigm
by James J DiNicolantonio and Amy Berger

Mankind has survived without isolated, refined sugar for almost 2.6 million years.48 The body—in particular, the brain—has been thought to require upwards of 200 g of glucose per day, leading to the often cited dogma that glucose is ‘essential for life’.1 While it is true that glucose is essential for sustaining life, there is no requirement for dietary glucose, as fatty acids can be turned into brain-fuelling ketone bodies, and amino acids and glycerol are gluconeogenic substrates.49 Indeed, in the relative absence of dietary glucose, ketone bodies may supply upwards of 75% of the brain’s required energy, with the remainder supplied by gluconeogenesis provided by amino acids (from dietary protein or catabolism of body proteins) and from glycerol (provided by the breakdown of triglycerides in adipose tissue).33 Thus, exogenous glucose (eg, from added sugars) is not essential for sustaining life in humans, and in most people, restricting dietary carbohydrates seems to produce no ill effects.49 In fact, according to the Food and Nutrition Board of the Institute of Medicine of the US National Academies of Sciences, ‘The lower limit of dietary carbohydrate compatible with life apparently is zero, provided that adequate amounts of protein and fat are consumed’.50

Administration of fructose or sucrose in humans has been shown to cause each of the abnormalities that define the metabolic syndrome (eg, elevated triglycerides, low high-density lipoprotein, insulin resistance, glucose intolerance, elevated blood glucose, elevated blood pressure and weight gain (specifically around the abdomen)),30 51–55 as well as features found in patients with coronary heart disease (eg, increased platelet adhesiveness and hyperinsulinaemia),56 57 all of which can be reversed entirely upon reverting to a diet low in sugar.47 52 56 58–60 Consumption of added sugars at current levels of intake is proposed as a contributing factor in a multitude of other diseases associated with early mortality, such as cardiometabolic disease,61–64 obesity,30 61 65–68 β-cell dysfunction and type 2 diabetes,6 20 69–71 hypertension,51 64 72 non-alcoholic fatty liver7 and atherosclerosis.6 73 74 Because of this, added sugars cannot be considered food.

What to Eat: The Ten Things You Really Need to Know to Eat Well and Be Healthy
by Luise Light, pp. 18-21

The alterations that were made to the new guide would be disastrous, I told my boss, the agency director. These changes would undermine the nutritional quality of eating patterns and increase risks for obesity and diabetes, among other diseases. No one needs that much bread and cereal in a day unless they are longshoremen or football players, and it would be unhealthy for the rest of us, especially people who are sedentary or genetically prone to obesity and diabetes. […]

At stake here, I told him, was nothing short of the credibility and integrity of the USDA as a source of reliable nutrition information. Over my objections, the alterations were included and the guide was finalized. I was told this was done in order to keep the lid on the costs of the food stamp program. Fruits and vegetables were expensive, much more expensive than breads and cereals, and the added servings of grains would, to some extent, offset the loss of nutrients from fruits and vegetables, the head of our division told me. However, the logic of that rationale escaped me.

Refined wheat products are what we called in the nutrition trade “cheap carbos,” stomach-filling food preferred when other, higher quality foods are unavailable or not affordable. They do little—if anything—to boost the nutritional quality of people’s diets and tend to add not only starch, but also fat and sugar to the diet. It was curious that there had been no discussion of the cost constraints of the food stamp program in any previous discussion over the many months we had been working on the guide. Intuitively, I knew I was being “played,” but other than stalling and requesting additional outside reviews I felt stymied.

Later, I remembered a Pan American Health Organization (PAHO) nutrition survey I had participated in during graduate school. One of our findings was a high rate of obesity among women in a particular region of the Caribbean country we were working in that had the lowest employment and per capita income. It puzzled me that the poorest region would have the most obese people until one of the physicians on our team explained that the prevalence of obesity was consistent with what he called an “impoverished diet,” too little nutritious food that caused people to feel hungry all the time, and with only cheap carbohydrates available to them, their hunger was never appeased, so they ate and ate and became fatter and fatter.

Was this inflated grain recommendation, I wondered, setting us up for a third world obesity scenario in our own country? Historically, the food guide was used to calculate the cost basis of the food stamps program. Did that mean we needed to develop two different sets of standards for nutrition, one for poor people and another for those better off, or did it mean that what was affordable in the food stamps program would determine what was best for the rest of us? Neither of these Hobson’s choices could be justified on scientific or ethical grounds. The changes that were made to the guide meant that any food product containing wheat flour, from white bread, Twinkies, Oreos, and bagels to pop toasters and Reese’s Puffs, would be considered nutritionally equivalent, which was not the case.

With my protests falling on deaf ears, the serving suggestions in the revised guide were incorporated into the regulations for the food stamps program, as well as the school breakfast and lunch, day care, and all other feeding programs administered by the USDA. Later, Congress set the serving amounts into legislative “stone” so it would be against the law not to serve the expanded number of grain servings that were in the new guide, a change that meant a financial windfall for the wheat industry. The new rules for school lunch programs increased the amount of bread and cereal products purchased for the program by 80 percent. For children in grades K through six, it meant eight daily servings of breads, cereals, and pasta, and for grades seven through twelve, ten servings.

For wheat growers, this meant an increase of 15 million bushels of wheat sold annually worth about $50 million and a retail sales boost of $350 million from additional sales of cereals, breads, and snacks. That didn’t include the extra sales resulting from the government subsidized food stamps program or revenues from the industry’s own efforts to shift public consumption toward more bread, pasta, and baked goods because of the new recommendations. Throughout the nineties, Americans increased their consumption of refined grain products from record lows in the 1970s to the six to eleven servings suggested in the new guide.

* * *

Partial credit for some of the quoted material goes to Bill Murrin, from comments he left at the article Dietary guidelines shouldn’t be this controversial, published at Marion Nestle’s website, Food Politics.

Medical-Industrial Complex

“Unless we put medical freedom into the Constitution, the time will come when medicine will organize into an undercover dictatorship…To restrict the art of healing to one class of men and deny equal privileges to others will constitute the Bastille of medical science. All such laws are un-American and despotic…, and have no place in a republic…The Constitution of this Republic should make special provisions for medical freedom as well as religious freedom.”

Dr. Benjamin Rush, signer of Declaration of Independence, member of Continental Congress

“The efforts of the medical profession in the US to control:…its…job it proposes to monopolize. It has been carrying on a vigorous campaign all over the country against new methods and schools of healing because it wants the business…I have watched this medical profession for a long time and it bears watching.”

Clarence Darrow (1857-1938), Populist leader and lawyer

“Medicine is a social science and politics is a medicine on a large scale…The very words ‘Public Health’ show those who are of the opinion that medicine has nothing to do with politics the magnitude of their error.”

Rudolf Virchow, (1821-1902) founder of cellular pathology

“The profession to which we belong, once venerated…-has become corrupt and degenerate to the forfeiture of its social position…”

Dr. Nathaniel Chapman, first president, AMA, 1848

In 1922, Herbert McLean Evans and Katharine Scott Bishop discovered vitamin E. Then in the following decades, from the 1930s to the 1940s, Drs. Wilfrid and Evan Shute treated 30,000 patients with natural vitamin E in their clinic and studied its health benefits. Despite all of the documented evidence, they had little influence in mainstream nutrition and medicine. They had the disadvantage of promoting a vitamin right at the beginning of the era when pharmaceuticals were getting all of the attention: “Better Living Through Chemistry.” Responding to the resistance of medical authorities, Dr. Evan Shute wrote in his book The Heart and Vitamin E (1956) that,

“It was nearly impossible now for anyone who valued his future in Academe to espouse Vitamin E, prescribe it or advise its use. That would make a man a “quack” at once. This situation lasted for many years. In the United States, of course, the closure of the JAMA pages against us and tocopherol meant that it did not exist. It was either in the U.S. medical bible or it was nought. No amount of documentation could budge medical men from this stance. Literature in the positive was ignored and left unread. Individual doctors often said: ‘If it is as good as you say, we would all be using it.’ But nothing could induce them as persons of scientific background to make the simplest trial on a burn or coronary.”

In the article Drs. Wilfrid and Evan Shute Cured Thousands with Vitamin E, Andrew W. Saul emphasized this suppression of new knowledge:

“The American Medical Association even refused to let the Shutes present their findings at national medical conventions. (p 148-9) In the early 1960’s, the United States Post Office successfully prevented even the mailing of vitamin E. (p 166).” Over the decades, others have taken note of the heavy-handedness of mainstream authorities. “The failure of the medical establishment during the last forty years,” wrote Linus Pauling in his 1985 Foreword, “to recognize the value of Vitamin E in controlling heart disease is responsible for a tremendous amount of unnecessary suffering and for many early deaths. The interesting story of the efforts to suppress the Shute discoveries about Vitamin E illustrates the shocking bias of organized medicine against nutritional measures for achieving improved health.”

What is motivating this ‘failure’? And is it really a failure or simply serving other interests, maybe quite successfully at that?

* * *

“Today, expulsion is again mustered into service in a war of ideology. …Modern society makes its heresies out of political economy…Ethics has always been a flexible, developing notion of medicine, with a strong flavor of economics from the start.”

Oliver Garceau, Dept. of Government, Harvard U., The Political Life of the AMA (1941)

“Everyone’s heard about the military-industrial complex, but they know very little about the medical-industrial complex…(in) a medical arms race…”

California Governor Jerry Brown, June 1980

“The new medical-industrial complex is now a fact of American life…with broad and potentially troubling implications…”

Dr. Arnold Relman, Editor, New England Journal of Medicine

“Bankers regard research as most dangerous and a thing that makes banking hazardous due to the rapid changes it brings about in industry.”

Charles Kettering, of Memorial Sloan Kettering Cancer Center, and Vice President of General Motors, (in Ralph Moss, Cancer Syndrome)

“The system of influence and control..is highly skewed in favor of the corporate and financial system. And this dominant influence is felt not only in universities, foundations, and institutions of higher learning, but also…from media to all other instruments of communication.”

Vicente Navarro (Professor of Health and Social Policy, Johns Hopkins U., and other credentials)

“In the feeding of hospital patients, more attention should be given to providing tasty and attractive meals, and less to the nutritive quality of the food.”
“People say that all you get out of sugar is calories, no nutrients…There is no perfect food, not even mother’s milk.”
“Have confidence in America’s food industry, it deserves it.”

Dr. Frederick Stare, Harvard U. School of Public Health, Nutrition Dept. Head

So, why are the powers that be so concerned with harmless supplements that consumers take in seeking self-healing and well-being? The FDA explained its motivations:

“It has been common…to combine such unproven ingredients as bio-flavinoids, rutin…, with such essential nutrients as Vitamin C…, thus implying that they are all nutritionally valuable for supplementation of the daily diet. The courts have sustained FDA legal action to prevent such practices, and the new FDA regulations preclude this type of combination in the future…Similarly, it has been common…to state or imply that the American diet is inadequate because of soil deficiencies, commercial processing methods, use of synthetic nutrients, and similar charges. FDA recognizes that these false statements have misled, scared, and confused the public, and is prohibiting any such general statements in the future…The medical and nutritional professions have shown strong support of this policy,…” (FDA Assistant General Counsel’s letter to 5 US Legislators, Hearings, US Congress, 1973).

To give a further example of this contorted thinking, consider another statement from an FDA official: “It is wholly unscientific to state that a well-fed body is more able to resist disease than a less well-fed body” (FDA’s Head of Nutrition Department, Dr. Elmer M. Nelson, in Gene Marine and Judith Van Allen, Food Pollution: The Violation of Our Inner Ecology). That is so absurd as to be unbelievable. Yet it’s sadly expected when one knows of incidents like Ancel Keys’ attack on John Yudkin amidst wholesale silencing of his detractors and the more recent high-level persecution of Tim Noakes, along with dozens of other examples.

The advocates of natural healing and sellers of nutritional supplements were criticizing the dominant system of big ag, big drug, and closely related industries. This was a challenge to power and profit, and so it could not be tolerated. One wouldn’t want the public to get confused… nor new generations of doctors, as explained by the Harvard Medical School Dean, Dr. David Edsall: “…students were obliged…to learn about an interminable number of drugs, many…valueless, …useless, some…harmful. …there is less intellectual freedom in the medical course than in almost any other form of professional education in this country.”

This is how we end up with young doctors, straight out of medical school, failing a basic test on nutrition (Most Mainstream Doctors Would Fail Nutrition). Who funds much of the development of medical school curricula? Private corporations, specifically big drug and big food, and the organizations that represent them. Once out of medical school, some doctors end up making millions of dollars by working for industry on the side, such as giving speeches to promote pharmaceuticals. Also, continuing education and scientific conferences are typically funded by this same big money from the private sphere. There is a lot of money sloshing around, not to mention the small bribes of free vacations and such given to doctors. It’s a perverse incentive and one that was carefully designed to manipulate and bias the entire healthcare system.

* * *

“[Doctors] collectively have done more to block adequate medical care for people of this country than any other single group.”

President Jimmy Carter

“I think doctors care very deeply about their patients, but when they organize into the AMA, their responsibility is to the welfare of doctors, and quite often, these lobbying groups are the only ones that are heard in the state capitols and in the capitol of our country.”

President Jimmy Carter

“The FDA and much, but not all, of the orthodox medical profession are actively hostile against vitamins and minerals… They are out to get the health food industry…And they are trying to do this out of active hostility and prejudice.”

Senator William Proxmire (in National Health Federation Bulletin, April 1974)

“Eminent nutritionists have traded their independence for the food industry’s favors.”

US Congressman Benjamin Rosenthal

“The problem with ‘prevention’ is that it does not produce revenues. No health plan reimburses a physician or a hospital for preventing a disease.”

NCI Deputy Director, Division of Cancer Cause and Prevention; and of Diet, Nutrition and Cancer Program

“What is the explanation for the blind eye that has been turned on the flood of medical reports on the causative role of carbohydrates in overweight, ever since the publication in 1864 of William Banting’s famous “Letter on Corpulence”? Could it be related, in part, to the vast financial endowments poured into the various departments of nutritional education by the manufacturers of our refined carbohydrate foodstuff?”

Robert C. Atkins, MD, Dr. Atkins Diet Revolution, c. 1972

“Although the stated purpose of licensure is to benefit the public…Consumers…have learned that licensing may add to the cost of services, while not assuring quality….Charges…the legal sector that licensure restricts competition, and therefore unnecessarily increases costs to consumers….Like other professionals, dietiticians can justify the enactment of licensure laws because licensing affords the opportunity to protect dietiticians from interference in their field by other practitioners…This protection provides a competitive advantage, and therefore is economically beneficial for dietiticians”

ADA President, Marilyn Haschske, JADA, 1984

“While millions of dollars were being projected for research on radiation and other cancer ‘cures’, there was an almost complete blackout on research that might have pointed to needed alterations in our environment, our industrial organization, and our food.”

Carol Lopate, in Health Policy Advisory Center, Health PAC Bulletin

“Research in the US has been seriously affected by restrictions imposed by foreign cartel members. …It has attempted to suppress the publication of scientific research data which were at variance with its monopoly interest. …The hostility of cartel members toward a new product which endangers their control of the market(:)…In the field of synthetic hormones, the cartel control has been …detrimental to our national interest.”

US Assistant Attorney General, Wendell Berge, Cartels, Challenge to the Free World. – in Eleanor McBean, The Poisoned Needle

“We are aware of many cases in industry, government laboratories, and even universities where scientists have been retaliated against when their professional standards interfered with the interests of their employers or funders. This retaliation has taken many forms, ranging from loss of employment and industry-wide blacklisting to transfers and withholding of salary increases and promotions. We are convinced that the visible problem is only the tip of the iceberg.”

American Chemical Society President, Alan C. Nixon, (in Science, 1973)

Similar to the struggles of the Shute brothers, this problem was faced by the early scientists studying the ketogenic diet and the early doctors using it to treat patients with epilepsy. The first research and application of the ketogenic diet began in the 1920s and it was quickly found useful for other health conditions. But after a brief period of interest and funding, the research was mostly shut down in favor of the emerging new drugs that could be patented and marketed. It was irrelevant that the keto diet was far more effective than any drugs produced then or since. The ketogenic diet lingered on in a few hospitals and clinics, until research was revived in the 1990s, about three-quarters of a century later. Yet, after hundreds of studies proving its efficacy for numerous diseases (obesity, diabetes, multiple sclerosis, Alzheimer’s, etc), mainstream authority figures and the mainstream media continue to dismiss it and spread fear-mongering, such as false and ignorant claims about ketoacidosis and kidney damage.

Also, consider X-ray therapy, which Dr. Émil Herman Grubbé pioneered in 1896 when he became the first to use X-rays to treat cancer. Did the medical profession embrace this great discovery? Of course not. It wasn’t acknowledged as useful until 1951. When asked what he thought about this backward mentality denying such a profound discovery, Dr. Grubbé didn’t mince words: “The surgeons. They controlled medicine, and they regarded the X-ray as a threat to surgery. At that time surgery was the only approved method of treating cancer. They meant to keep it the ‘only’ approved method by ignoring or rejecting any new methods or ideas. This is why I was called a ‘quack’ and nearly ejected from hospitals where I had practiced for years” (Herbert Bailey, Vitamin E: Your Key to a Healthy Heart). As with the Shute brothers, he was deemed a ‘quack’ and so case closed.

There have been many more examples over the past century, in particular during the oppressive Cold War era (Cold War Silencing of Science). The dominant paradigm during McCarthyism was far from limited to scapegoating commies and homosexuals. Anyone stepping out of line could find themselves targeted by the powerful. This reactionary impulse goes back many centuries, continues to exert its influence to this day, and continues to punish those who dare speak out (Eliminating Dietary Dissent). This hindering of innovation and progress may be holding civilization back by centuries. We seem unable to deal with the simplest of problems, even when we already have the knowledge of how to solve them.

* * *

“Relevant research on the system as a whole has not been done… It is remarkable that with the continuing health care ‘crisis’, so few studies of the consequences of alternative modes of delivering care have been done. Such a paucity of studies is no accident; such studies would challenge structural interests of both professional monopoly (MD’s) and corporate rationalization in maintaining health institutions as they now exist or in directing their ‘orderly’ expansion.”

Robert R. Alford, Professor, UC Santa Cruz, Health Care Politics

“…It seems that public officials are afraid that if they make any move, or say anything antagonistic to the wishes of the medical organization, they will be pounced upon and destroyed. ..Public officials seem to be afraid of their jobs and even of their lives.”

US Senator Elmer Thomas, In Morris A. Bealle, The Drug Story. c. 1949 and 1976

“I think every doctor should know the shocking state of affairs…We discovered they (the FDA) failed to effectively regulate the large manufacturers and powerful interests while recklessly persecuting the small manufacturers. …(The FDA is) harassing (small) manufacturers and doctors…(and) betrays the public trust.”

Senator Edward V. Long. 1967

“The AMA protects the image of the food processors by its constant propaganda that the American food supply is the finest in the world, and that (those) who question this are simply practicing quackery. The food processors, in turn, protect the image of the AMA and of the drug manufacturers by arranging for the USDA and its dietitic cronies to blacklist throughout the country and in every public library, all nutrition books written for the layman, which preach simple, wholesome nutrition and attack …both the emasculation of natural foods and orthodox American medical care, which ignores subtle malnutrition and stresses drug therapy, (“as distinct from vitamin therapy”) for innumerable conditions. The drug manufacturers vigorously support the AMA since only MD’s can prescribe their products.”

Miles H. Robinson, MD; Professor, University of Pennsylvania and Vanderbilt Medical Schools, exhibit in Vitamin, Mineral, and Diet Supplements, Hearings, US House of Representatives, 1973

“The AMA puts the lives and well being of the American citizens well below it’s own special interest…It deserves to be ignored, rejected, and forgotten. No amount of historical gymnastics can hide the public record of AMA opposition to virtually every major health reform in the past 50 years….The AMA has turned into a propaganda organ purveying ‘medical politics’ for deceiving the Congress, the people, and the doctors of America themselves.”

Senator Edward Kennedy, in UPI National Chronicle, 1971

“The hearings have revealed police-state tactics…possibly perjured testimony to gain a conviction,…intimidation and gross disregard for the Constitutional Rights…(of) First, Fourth, Fifth, and Sixth Amendments, (by the FDA)”
“The FDA (is) bent on using snooping gear to pry and invade…”
“Instance after instance of FDA raids on small vitamin and food supplement manufacturers. These small, defenseless businesses were guilty of producing products which FDA officials claimed were unnecessary.”
“If the FDA would spend a little less time and effort on small manufacturers of vitamins…and a little more on the large manufacturers of…dangerous drugs…, the public would be better served.”

Senator Long from various Senate hearings

“From about 1850 until the late 1930’s, one of the standing jokes in the medical profession, was about a few idiots who called themselves doctors, who claimed they could cure pneumonia by feeding their patients moldy bread. …Until…they discovered penicillin…in moldy bread!”

P.E. Binzel, MD, in Thomas Mansell, Cancer Simplified, 1977

“Penicillin sat on a shelf for ten years while I was called a quack.”

Sir Alexander Fleming.

“(In) 1914…Dr. Joseph Goldberger had proven that (pellagra) was related to diet, and later showed that it could be prevented by simply eating liver or yeast. But it wasn’t until the 1940’s…that the ‘modern’ medical world fully accepted pellagra as a vitamin B deficiency.”

G. Edward Griffin, World Without Cancer

“…The Chinese in the 9th century AD utilized a book entitled The Thousand Golden Prescriptions, which described how rice polish could be used to cure beri-beri, as well as other nutritional approaches to the prevention and treatment of disease. It was not until twelve centuries later that the cure for beri-beri was discovered in the West, and it acknowledged to be a vitamin B-1 deficiency disease.”

Jeffrey Bland, PhD, Your Health Under Siege: Using Nutrition to Fight Back

“The intolerance and fanaticism of official science toward Eijkman’s observations (that refined rice caused beri-beri) brought about the death of some half million people on the American continent in our own century alone between 1900 and 1910.”

Josué de Castro, The Geography of Hunger

“In 1540…Ambroise Paré…persuaded doctors to stop the horrid practice of pouring boiling oil on wounds and required all doctors to wash thoroughly before delivering babies or performing surgery….(in) 1844…Ignaz Semmelweis in Vienna proved…that clean, well-scrubbed doctors would not infect and kill mothers at childbirth. For his efforts Semmelweis was dismissed from his hospital…(and) despite publication, his work was totally ignored. As a result he became insane and died in an asylum, and his son committed suicide.”
“As a chemist working for the US Government in 1916 on the island of Luzon (Philippines), (R.R.) Williams, over the opposition of orthodox medicine, had managed to eradicate beri-beri…by persuading the population to drink rice bran tea. In 1917, Williams was recalled to the US, and thereafter orthodox medicine discouraged anyone from drinking rice bran tea, so by 1920 there were more beri-beri deaths on Luzon than in 1915. ..In 1934, R.R. Williams (now) at Bell Telephone Labs., discovered thiamine (vitamin B-1), and that thiamine in rice bran both prevented and cured beri-beri.”
“Christian Eikman in Holland…shared the Nobel prize for Medicine in 1929 for Proving in 1892 that beri-beri was not an infectious disease…”

Wayne Martin, BS, Purdue University; Medical Heroes and Heretics, & “The Beri-beri analogy to myocardial infarction”, Medical Hypothesis

“In the 1850’s, Ignaz P. Semmelweis, a Hungarian doctor, discovered that childbed fever, which then killed about 12 mothers out of every 100, was contagious…and that doctors themselves were spreading the disease by not cleaning their hands. He was ridiculed…Opponents of his idea attacked him fiercely….(and) brought on (his) mental illness….(he) died a broken man.”

Salem Kirban, Health Guide for Survival

“…Galen…was…forced to flee Rome to escape the frenzy of the mob….Vesalius was denounced as an imposter and heretic…William Harvey was disgraced as a physician…William Roentgen…was called a quack and then condemned…”
“In…1535, when…Jacques Cartier found his ships…in…the St. Lawrence River, scurvy began…and then a friendly Indian showed them (that) tree bark and needles from the white pine – both rich in…Vitamin C – were stirred into a drink (for) swift recovery. Upon returning to Europe, Cartier reported this incident to the medical authorities. But they were amused by such ‘witch-doctor cures of ignorant savages’ and did nothing to follow it up…”
“It took over 200 years and cost hundreds of thousands of lives before the medical experts began to accept…Finally, in 1747, John Lind..discovered that oranges and lemons produced relief from scurvy…and yet it took 48 more years before his recommendation was put into effect….’Limeys’ would soon become rulers of the ‘Seven Seas’…”
“In 1593, Sir Richard Hawkins noted and later published, in observations on his voyage into the South Seas, references that natives of the area used sour oranges and lemons as a cure for scurvy, and a similar result was noted among his crew. …In 1804, regulations were introduced into the British Navy requiring use of lime juice….(and) into law by the British Board of Trade in 1865….It took two centuries to translate empirical observations into action…”

Maureen Salaman, MSc, Nutrition: the Cancer Answer

Most of the above quotes were found on a webpage put together by Wade Frazier (Medical Dark Ages Quotes). He gathered the quotes from Ralph Hovnanian’s 1990 book, Medical Dark Ages.

Americans Fatter at Same Level of Food Intake and Exercise

Americans, to state the obvious, are unhealthier with each passing generation. And the most obvious sign of this is the rising obesity rate. In one analysis, this was shown to be true even when controlling for levels of food intake and exercise (see article below). This is the kind of data that undermines conventional dietary advice based on Christian moralizing about the deadly sins of gluttony and sloth.

Heart attacks and obesity first became a public health concern in the 1940s and 1950s. That followed decades during which seed oils and margarine had largely replaced lard in the American diet. We were told that saturated fat is dangerous and that seed oils were great for health. Americans were listening and they strictly followed this advice. Even restaurants stopped cooking their french fries in tallow.

In particular, olive oil has been sold as the best. Why is olive oil supposed to be so healthy? Because it has monounsaturated fat, the same fat primarily found in lard. Not too long ago, the healthiest population in the United States was in Roseto, Pennsylvania. Guess what their main source of fat was? Lard. They also ate massive loads of meat, as do other long-lived populations in the world such as in Hong Kong.

Red meat consumption also decreased over that period and has continued to decrease since then. Dairy has followed this pattern of decline. Americans are eating less animal fat now than ever before in American history, and probably in all of human existence. It’s true that Americans are eating more lean chicken and fish, but we were told those are healthy for us. Meanwhile, Americans are eating more fruits and vegetables, nuts and seeds than ever before.

Calories-in/calories-out has been an utter failure. It’s not how much we are eating but what we are eating. That then determines how our metabolism functions, whether it burns fat or stores it. Exercise is largely irrelevant for fat loss. Fat people can exercise all the time and not lose weight, while some skinny people hardly move at all. Another study “demonstrated that there is no difference in total energy expenditure between traditional hunter-gatherers, subsistence farmers and modern Westerners.”

One explanation is an increase of obesogens. These are chemicals that cause the body to create fat. In general, fat is where the body stores excess toxins that overwhelm it. And indeed younger Americans are exposed to more toxins. This then makes losing weight hard because all the toxins get released and make one feel like shit. It’s hard for the body to eliminate a lifetime of accumulated toxicity. On top of that, the young are prescribed more medications than ever before. Antidepressants and antipsychotics have been given out like candy to anyone with mild mental issues. What is a common side effect of these drugs? Yep, weight gain.

A third possibility is more complex. We know the gut microbiome has shrunk in number and diversity. It’s also changed in the profile of bacteria. Research is showing how important the microbiome is (see The Secret Life of Your Microbiome by Susan L. Prescott and Alan C. Logan). Toxins and drugs, by the way, also alter the microbiome. So does diet. Even if total calorie intake hasn’t changed much relative to the increased height of the population, what has changed is what we are eating.

In place of animal fats, we are eating not only more seed oils but also more carbs and sugar. Animal fats are highly satiating and so food companies realized they needed to find something equally satiating. It turns out a high-carb diet is not only satiating but addictive. It knocks people out of ketosis and causes them to put on weight. It doesn’t matter if one tries to eat less. In processed foods, when carbs are combined with seed oils, the body is forced to burn the carbs immediately and so it has no choice but to turn the seed oils into fat.

By the way, what alters metabolism also alters the microbiome. This is seen when people go from a high-carb diet to a ketogenic diet. Ketosis is powerful in its impact on how the body functions in so many ways, even changing epigenetic expression of genes. Here is the worst part. Those epigenetic changes have been happening for generations with the loss of regular ketosis. Even epigenetic markers for obesity, following an environmental trigger like famine, have been shown to pass across multiple generations. The microbiome, of course, is also inherited, and each of those bacteria likewise has an epigenome that determines its genetic expression.

Everything we do as individuals, good and bad, doesn’t only affect us as individuals. People are getting fatter now not only because of what they are doing differently but because of everything that was done by their parents, grandparents, and great-grandparents. As I’ve said before, even if we reversed all these changes instantly, as we are unlikely to do, it would still require generations to fully reverse the consequences.

* * *

Why It Was Easier to Be Skinny in the 1980s
by Olga Khazan

A study published recently in the journal Obesity Research & Clinical Practice found that it’s harder for adults today to maintain the same weight as those 20 to 30 years ago did, even at the same levels of food intake and exercise. […]

Just what those other changes might be, though, are still a matter of hypothesis. In an interview, Kuk proffered three different factors that might be making it harder for adults today to stay thin.

First, people are exposed to more chemicals that might be weight-gain inducing. Pesticides, flame retardants, and the substances in food packaging might all be altering our hormonal processes and tweaking the way our bodies put on and maintain weight.

Second, the use of prescription drugs has risen dramatically since the 1970s and ’80s. Prozac, the first blockbuster SSRI, came out in 1988. Antidepressants are now one of the most commonly prescribed drugs in the U.S., and many of them have been linked to weight gain.

Finally, Kuk and the other study authors think that the microbiomes of Americans might have somehow changed between the 1980s and now. It’s well known that some types of gut bacteria make a person more prone to weight gain and obesity. Americans are eating more meat than they were a few decades ago, and many animal products are treated with hormones and antibiotics in order to promote growth. All that meat might be changing gut bacteria in ways that are subtle, at first, but add up over time. Kuk believes that the proliferation of artificial sweeteners could also be playing a role.

Why Do Americans Keep Getting Fatter?
by Chris Bodenner

Notwithstanding the known errors of dietary assessment, it is interesting that we observe consistent trends over time in terms of how dietary intake relates with obesity and how this relationship has changed over time. This lends more confidence to our primary findings and suggests that there are either physiological changes in how diet relates with body weight or differences in how individuals are reporting their dietary intake over time. […]

[W]e observed that the BMI associated with a given leisure time physical activity frequency was still higher over time in men. This may be attributed to changes in non-leisure time physical activity such as reductions in occupational physical activity or increasing screen time. However, a study using doubly labelled water demonstrated that there is no difference in total energy expenditure between traditional hunter-gatherers, subsistence farmers and modern Westerners. Thus, numerous other factors in addition to energy intake and physical activity may be important to consider when trying to explain the rise in obesity, and should be further evaluated in further studies. […]

Diet and Industrialization, Gender and Class

Below are a couple of articles about the shift in diet since the 19th century. Earlier Americans ate a lot of meat, lard, and butter. It’s how everyone ate — women and men, adults and children — as that was what was available and everyone ate meals together. Then there was a decline in consumption of both red meat and lard in the early 20th century (dairy has also seen a decline). The changes created a divergence in who was eating what.

It’s interesting that, as part of moral panic and identity crisis, diets became gendered as part of reinforcing social roles and the social order. It’s strange that industrialization and gendering happened simultaneously, although maybe it’s not so strange. It was largely industrialization, by altering society so dramatically, that caused the sense of panic and crisis. So, diet also became heavily politicized and used for social engineering, a self-conscious campaign to create a new kind of society of individualism and the nuclear family.

This period also saw the rise of the middle class as an ideal, along with increasing class anxiety and class war. This led to the popularity of cookbooks within bourgeois culture, as the foods one ate not only came to define gender identity but also class identity. As grains and sugar were only becoming widely available in the 19th century with improved agriculture and international trade, the first popular cookbooks were focused on dessert recipes (Liz Susman Karp, Eliza Leslie: The Most Influential Cookbook Writer of the 19th Century). Before that, desserts had been limited to the rich.

Capitalism was transforming everything. The emerging industrial diet was self-consciously created to not only sell products but to sell an identity and lifestyle. It was an entire vision of what defined the good life. Diet became an indicator of one’s place in society, what one aspired toward or was expected to conform to.

* * *

How Steak Became Manly and Salads Became Feminine
Food didn’t become gendered until the late 19th century.
by Paul Freedman

Before the Civil War, the whole family ate the same things together. The era’s best-selling household manuals and cookbooks never indicated that husbands had special tastes that women should indulge.

Even though “women’s restaurants” – spaces set apart for ladies to dine unaccompanied by men – were commonplace, they nonetheless served the same dishes as the men’s dining room: offal, calf’s heads, turtles and roast meat.

Beginning in the 1870s, shifting social norms – like the entry of women into the workplace – gave women more opportunities to dine without men and in the company of female friends or co-workers.

As more women spent time outside of the home, however, they were still expected to congregate in gender-specific places.

Chain restaurants geared toward women, such as Schrafft’s, proliferated. They created alcohol-free safe spaces for women to lunch without experiencing the rowdiness of workingmen’s cafés or free-lunch bars, where patrons could get a free midday meal as long as they bought a beer (or two or three).

It was during this period that the notion that some foods were more appropriate for women started to emerge. Magazines and newspaper advice columns identified fish and white meat with minimal sauce, as well as new products like packaged cottage cheese, as “female foods.” And of course, there were desserts and sweets, which women, supposedly, couldn’t resist.

How Crisco toppled lard – and made Americans believers in industrial food
by Helen Zoe Veit

For decades, Crisco had only one ingredient, cottonseed oil. But most consumers never knew that. That ignorance was no accident.

A century ago, Crisco’s marketers pioneered revolutionary advertising techniques that encouraged consumers not to worry about ingredients and instead to put their trust in reliable brands. It was a successful strategy that other companies would eventually copy. […]

It was only after a chemist named David Wesson pioneered industrial bleaching and deodorizing techniques in the late 19th century that cottonseed oil became clear, tasteless and neutral-smelling enough to appeal to consumers. Soon, companies were selling cottonseed oil by itself as a liquid or mixing it with animal fats to make cheap, solid shortenings, sold in pails to resemble lard.

Shortening’s main rival was lard. Earlier generations of Americans had produced lard at home after autumn pig slaughters, but by the late 19th century meat processing companies were making lard on an industrial scale. Lard had a noticeable pork taste, but there’s not much evidence that 19th-century Americans objected to it, even in cakes and pies. Instead, its issue was cost. While lard prices stayed relatively high through the early 20th century, cottonseed oil was abundant and cheap. […]

In just five years, Americans were annually buying more than 60 million cans of Crisco, the equivalent of three cans for every family in the country. Within a generation, lard went from being a major part of American diets to an old-fashioned ingredient. […]

In the decades that followed Crisco’s launch, other companies followed its lead, introducing products like Spam, Cheetos and Froot Loops with little or no reference to their ingredients.

Once ingredient labeling was mandated in the U.S. in the late 1960s, the multisyllabic ingredients in many highly processed foods may have mystified consumers. But for the most part, they kept on eating.

So if you don’t find it strange to eat foods whose ingredients you don’t know or understand, you have Crisco partly to thank.

 

Dietary Risk Factors for Heart Disease and Cancer

Based on a study of 42 European countries, a recent scientific paper reported that, “the highest CVD [cardiovascular disease] prevalence can be found in countries with the highest carbohydrate consumption, whereas the lowest CVD prevalence is typical of countries with the highest intake of fat and protein.” And that, “The positive effect of low-carbohydrate diets on CVD risk factors (obesity, blood lipids, blood glucose, insulin, blood pressure) is already apparent in short-term clinical trials lasting 3–36 months (58) and low-carbohydrate diets also appear superior to low-fat diets in this regard (36, 37).” Basically, for heart health, this would suggest eating more full-fat dairy, eggs, meat, and fish while cutting back on starches, sugar, and alcohol. That is to say, follow a low-carb diet. It doesn’t mean eat just any low-carb diet, though, for the focus is on animal foods.

By the way, when you dig into the actual history of the Blue Zones (healthy, long-lived populations), what you find is that their traditional diets included large portions of animal foods, including animal fat (Blue Zones Dietary Myth, Eat Beef and Bacon!, Ancient Greek View on Olive Oil as Part of the Healthy Mediterranean Diet). The longest-lived society in the entire world, in fact, is also the one with the highest meat consumption per capita, even more than Americans. What society is that? Hong Kong. In general, nutrition studies in Asia have long shown that those eating more meat have the best health outcomes. This contradicts earlier Western research, as we’re dealing with how the healthy user effect manifests differently according to culture. But even in the West, the research is increasingly falling in line with the Eastern research, such as with the study I quoted above. And that study is far from being the only one (Are ‘vegetarians’ or ‘carnivores’ healthier?).

This would apply to both meat-eaters and vegetarians, as even vegetarians could put greater emphasis on nutrient-dense animal foods. It is specifically saturated fat and animal proteins that were most strongly associated with better health, both of which could be obtained from dairy and eggs. Vegans, on the other hand, would obviously be deficient in this area. But certain plant foods (tree nuts, olives, citrus fruits, low-glycemic vegetables, and wine, though not distilled beverages) also showed some benefit. Considering plant foods, those specifically associated with greater risk of heart disease, strokes, etc were those high in carbohydrates, such as grains. Unsurprisingly, sunflower oil was a risk factor, probably related to seed oils being inflammatory and oxidative (not to mention mutagenic); but oddly onions were likewise implicated, if only weakly. Other foods showed up in the data, but the above were the most interesting and important.

Such correlations, of course, can’t prove causation. But it fits the accumulating evidence: “These findings strikingly contradict the traditional ‘saturated fat hypothesis’, but in reality, they are compatible with the evidence accumulated from observational studies that points to both high glycaemic index and high glycaemic load (the amount of consumed carbohydrates × their glycaemic index) as important triggers of CVDs. The highest glycaemic indices (GI) out of all basic food sources can be found in potatoes and cereal products, which also have one of the highest food insulin indices (FII) that betray their ability to increase insulin levels.” All of that seems straightforward, according to the overall data from nutrition studies (see: Uffe Ravnskov, Richard Smith, Robert Lustig, Eric Westman, Ben Bikman, Gary Taubes, Nina Teicholz, etc). Regarding saturated fat not being linked to CVD risk, Andrew Mente discusses a meta-analysis he worked on and a separate meta-analysis by Siri-Tarino et al. (New Evidence Reveals that Saturated Fat Does Not Increase the Risk of Cardiovascular Disease). Likewise, many experts no longer see cholesterol as a culprit either (Uffe Ravnskov et al, LDL-C does not cause cardiovascular disease: a comprehensive review of the current literature).
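As a point of reference for the glycaemic load mentioned in that quote, the conventional formula multiplies a food’s glycaemic index by the grams of available carbohydrate in the portion and divides by 100 (the quoted passage gives the unnormalized product). Here is a minimal sketch with illustrative, assumed values rather than data from the study:

```python
# A minimal sketch of the conventional glycaemic load formula:
# GL = GI * available carbohydrate (g) / 100.
# The GI and portion figures below are illustrative assumptions, not study data.
def glycaemic_load(glycaemic_index: float, carbs_g: float) -> float:
    return glycaemic_index * carbs_g / 100.0

print(glycaemic_load(glycaemic_index=85, carbs_g=30))  # high-GI starch portion -> 25.5
print(glycaemic_load(glycaemic_index=40, carbs_g=10))  # low-glycaemic vegetable portion -> 4.0
```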

Yet one other odd association was discovered: “In fact, our ecological comparison of cancer incidence in 39 European countries (for 2012; (59)) can bring another important argument. Current rates of cancer incidence in Europe are namely the exact geographical opposite of CVDs (see Fig. 28). In sharp contrast to CVDs, cancer correlates with the consumption of animal food (particularly animal fat), alcohol, a high dietary protein quality, high cholesterol levels, high health expenditure, and above average height. These contrasting patterns mirror physiological mechanisms underlying physical growth and the development of cancer and CVDs (60). The best example of this health paradox is again that of French men, who have the lowest rates of CVD mortality in Europe, but the highest rates of cancer incidence. In other words, cancer and CVDs appear to express two extremes of a fundamental metabolic disbalance that is related to factors such as cholesterol and IGF-1 (insulin-like growth factor).”

That is an argument people have made, but it’s largely been theoretical. In response, others have argued the opposite position (High vs Low Protein, Too Much Protein?, Gundry’s Plant Paradox and Saladino’s Carnivory, Carcinogenic Grains). It’s true that, for example, eating meat increases IGF-1, at least temporarily. Then again, eating in general does the same. And on a diet low enough in carbs, it’s been shown in studies that people naturally reduce their calorie intake, which would reduce IGF-1. And for really low-carb, the ketogenic diet is specifically defined as being moderate in protein while higher in fat. A low-carb diet is not necessarily a high-animal-protein diet, especially when combined with intermittent fasting such as OMAD (one meal a day) with long periods of downregulated IGF-1. Also, this study didn’t appear to include plant proteins in the data, so we don’t know if eating lots of soy, hemp protein powder, etc would show similar results; although nuts were mentioned in the report as correlating with CVD health similarly to meat but, as far as I know, were not mentioned in terms of cancer. What would make animal proteins more carcinogenic than plant proteins or, for that matter, plant carbohydrates? The hypothetical mechanism is not clear.

This anomaly would’ve been more interesting if the authors had surveyed the research literature. It’s hard to know what to make of it since other studies have pointed to the opposite conclusion, that the risks of these two are closely linked, rather than being inversely associated: “Epidemiologically, a healthy lifestyle lessens the risk of both cardiovascular disease and cancer, as first found in the Nurses’ Health study” (Lionel Opie, Cancer and cardiovascular disease; see Rob M. Van Dam, Combined impact of lifestyle factors on mortality). “Research has shown there are interrelationships among type 2 diabetes, heart disease, and cancer. These interrelationships may seem coincidental and based only on the fact these conditions share common risk factors. However, research suggests these diseases may relate to one another in multiple ways and that nutrition and lifestyle strategies used to prevent and manage these diseases overlap considerably” (Karen Collins, The Cancer, Diabetes, and Heart Disease Link).

Yet other researchers did find the same inverse relationship: “We herein report that, based on two separate medical records analysis, an inverse correlation between cancer and atherosclerosis” (Matthew Li et al, If It’s Not One Thing, It’s Another). But there was an additional point: “We believe that the anti-inflammatory aspect of cancer’s pan-inflammatory response plays an important role towards atherosclerotic attenuation.” Interesting! In that case, one of the key causal mechanisms to be considered is inflammation. Some diets high in animal proteins would be inflammatory, such as the Standard American Diet, whereas others would be anti-inflammatory. Eliminating seed oils (e.g., sunflower oil) would by itself reduce inflammation. Reducing starches and sugar would help as well. So, is it the meat that increases cancer or is it what the meat is being cooked in or eaten with? That goes back to the healthy and unhealthy user effects.

As this confounding factor is central, we might want to consider the increasingly common view that inflammation is involved in nearly every major disease. “For example, inflammation causes or is a causal link in many health problems or otherwise seen as an indicator of health deterioration (arthritis, depression, schizophrenia, etc), but inflammation itself isn’t the fundamental cause since it is a protective response itself to something else (allergens, leaky gut, etc). Or as yet another example, there is the theory that cholesterol plaque in arteries doesn’t cause the problem but is a response to it, as the cholesterol is essentially forming a scab in seeking to heal injury. Pointing at cholesterol would be like making accusations about firefighters being present at fires” (Coping Mechanisms of Health).

What exacerbates or moderates inflammation will be pivotal to overall health (Essentialism On the Decline), especially the nexus of disease called metabolic syndrome/derangement or what used to be called syndrome X: insulin resistance, diabetes, obesity, heart disease, strokes, etc. In fact, other researchers point directly to inflammation as being a common factor of CVD and cancer: “Although commonly thought of as two separate disease entities, CVD and cancer possess various similarities and possible interactions, including a number of similar risk factors (e.g. obesity, diabetes), suggesting a shared biology for which there is emerging evidence. While chronic inflammation is an indispensible feature of the pathogenesis and progression of both CVD and cancer, additional mechanisms can be found at their intersection” (Ryan J. Koene et al, Shared Risk Factors in Cardiovascular Disease and Cancer). But it might depend on the specific conditions how inflammation manifests as disease — not only CVD or cancer but also arthritis, depression, Alzheimer’s, etc.

This is the major downfall of nutrition studies, as the experts in the field find themselves hopelessly mired in a replication crisis. There is too much contradictory research and, when much of it has been repeated, it simply did not replicate. That is to say much of it is simply wrong or misinterpreted. And as few have attempted to replicate much of it, we aren’t entirely sure what is valid and what is not. That further problematizes meta-analyses, despite how potentially powerful that tool can be when working with quality research. The study I’ve been discussing here was an ecological study, and that design has its limitations. The researchers couldn’t disentangle all the major confounding factors, much less control for them in the first place, as they were working with data across decades that came from separate countries. Even so, it’s interesting and useful info to consider. And keep in mind that almost all official dietary recommendations are based on observational (associative, correlative, epidemiological) studies with far fewer controls. This is the nature of the entire field of nutrition studies, as long-term randomized and controlled studies on humans are next to impossible to do.

So, as always, qualifications must be made. The study’s authors state that, “In items of smaller importance (e.g. distilled beverages, sunflower oil, onions), the results are less persuasive and their interpretation is not always easy and straightforward. Similar to observational studies, our ecological study reflects ‘real-world data’ and cannot always separate mutual interactions among the examined variables. Therefore, the reliance on bivariate correlations could lead to misleading conclusions. However, some of these findings can be used as a starting point of medical hypotheses, whose validity can be investigated in controlled clinical trials.” Nonetheless, “The reasonably high accuracy of the input data, combined with some extremely high correlations, together substantially increase the likelihood of true causal relationships, especially when the results concern principal components of food with high consumption rates, and when they can be supported by other sources.”
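To make that caveat about bivariate correlations concrete, here is a minimal, purely hypothetical sketch in Python (not from the study; all variable names and numbers are invented for illustration) of how a shared confounder such as health expenditure could make a food item look protective in country-level data even when, by construction, it has no effect at all.

```python
# Hypothetical illustration (not from the study): a shared confounder can
# produce a misleading bivariate correlation across countries.
import numpy as np

rng = np.random.default_rng(42)
n_countries = 42  # matches the number of countries compared, purely for flavor

# Confounder: health expenditure (arbitrary units), varying across countries.
health_expenditure = rng.normal(loc=100, scale=20, size=n_countries)

# A food item eaten more in wealthier countries, with no direct effect on CVD here.
food_intake = 0.5 * health_expenditure + rng.normal(0, 5, n_countries)

# CVD mortality falls as health expenditure rises (again, by construction only).
cvd_mortality = 300 - 1.5 * health_expenditure + rng.normal(0, 15, n_countries)

# The naive bivariate correlation makes the food look protective.
r_bivariate = np.corrcoef(food_intake, cvd_mortality)[0, 1]

def residuals(y, x):
    """Remove the linear effect of x from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partialling out the confounder removes most of the apparent association.
r_partial = np.corrcoef(
    residuals(food_intake, health_expenditure),
    residuals(cvd_mortality, health_expenditure),
)[0, 1]

print(f"bivariate r = {r_bivariate:.2f}, partial r = {r_partial:.2f}")
```

In this toy setup the bivariate correlation comes out strongly negative, while correlating the residuals after removing the confounder shrinks it toward zero; that is roughly the kind of mutual interaction the authors say an ecological design cannot always separate.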

This data is meaningful in offering strong supporting evidence. The finding about animal foods and starchy foods is the main takeaway, however tentative the conclusion may be for real-world application, at least when taking this evidence in isolation. But the inverse correlation of CVD risk and cancer risk stands out and probably indicates confounders across populations, and that would be fertile territory for other researchers to explore. The main importance of this study is less in the specifics and more in how it further challenges the broad paradigm that has dominated nutrition studies for the past half century or so. The most basic point is that the diet-heart hypothesis simply doesn’t make sense of the evidence and it never really did. When the hypothesis was first argued, heart disease was going up precisely at the moment saturated fat intake was going down, since seed oils had replaced lard as the main fat source in the decades prior. Interestingly, lard has been a common denominator among most long-lived populations, from the Okinawans to the Rosetans (Ancient Greek View on Olive Oil as Part of the Healthy Mediterranean Diet, Blue Zones Dietary Myth).

This study is further support for a newly emerging understanding, as seen with the American Heart Association backing off from its earlier position (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines). Fat is not the enemy of humanity, as seen with the high-fat ketogenic diet, where fat is used as the primary fuel instead of carbohydrates (Ketogenic Diet and Neurocognitive Health, The Ketogenic Miracle Cure, The Agricultural Mind). In fact, we wouldn’t be here without fat, as it is the evolutionary and physiological norm, specifically in terms of low-carb (Is Ketosis Normal?, “Is keto safe for kids?”). It used to be common knowledge, instead, that too many carbohydrates are unhealthy (American Heart Association’s “Fat and Cholesterol Counter” (1991)). The consensus shifted a half century ago, the last time low-carb diets were part of mainstream thought, and now it is shifting back the other way. The old consensus will be new again.

* * *

Carbohydrates, not animal fats, linked to heart disease across 42 European countries
by Keir Watson

Key findings

  • Cholesterol levels were tightly correlated to the consumption of animal fats and proteins – Countries consuming more fat and protein from animal sources had higher incidence of raised cholesterol
  • Raised cholesterol correlated negatively with CVD risk – Countries with higher levels of raised cholesterol had fewer cases of CVD deaths and a lower incidence of CVD risk factors
  • Carbohydrates correlated positively with CVD risk – the more carbohydrates consumed (and especially those with high GI such as starches) the more CVD
  • Fat and Protein correlated negatively with CVD risk – Countries consuming more fat and protein from animal and plant sources had less CVD. The authors speculate that this is because increasing fat and protein in the diet generally displaces carbohydrates.

Food consumption and the actual statistics of cardiovascular diseases: an epidemiological comparison of 42 European countries
Pavel Grasgruber,* Martin Sebera, Eduard Hrazdira, Sylva Hrebickova, and Jan Cacek

Results

We found exceptionally strong relationships between some of the examined factors, the highest being a correlation between raised cholesterol in men and the combined consumption of animal fat and animal protein (r=0.92, p<0.001). The most significant dietary correlate of low CVD risk was high total fat and animal protein consumption. Additional statistical analyses further highlighted citrus fruits, high-fat dairy (cheese) and tree nuts. Among other non-dietary factors, health expenditure showed by far the highest correlation coefficients. The major correlate of high CVD risk was the proportion of energy from carbohydrates and alcohol, or from potato and cereal carbohydrates. Similar patterns were observed between food consumption and CVD statistics from the period 1980–2000, which shows that these relationships are stable over time. However, we found striking discrepancies in men’s CVD statistics from 1980 and 1990, which can probably explain the origin of the ‘saturated fat hypothesis’ that influenced public health policies in the following decades.

Conclusion

Our results do not support the association between CVDs and saturated fat, which is still contained in official dietary guidelines. Instead, they agree with data accumulated from recent studies that link CVD risk with the high glycaemic index/load of carbohydrate-based diets. In the absence of any scientific evidence connecting saturated fat with CVDs, these findings show that current dietary recommendations regarding CVDs should be seriously reconsidered. […]

Irrespective of the possible limitations of the ecological study design, the undisputable finding of our paper is the fact that the highest CVD prevalence can be found in countries with the highest carbohydrate consumption, whereas the lowest CVD prevalence is typical of countries with the highest intake of fat and protein. The polarity between these geographical patterns is striking. At the same time, it is important to emphasise that we are dealing with the most essential components of the everyday diet.

Health expenditure – the main confounder in this study – is clearly related to CVD mortality, but its influence is not apparent in the case of raised blood pressure or blood glucose, which depend on the individual lifestyle. It is also difficult to imagine that health expenditure would be able to completely reverse the connection between nutrition and all the selected CVD indicators. Therefore, the strong ecological relationship between CVD prevalence and carbohydrate consumption is a serious challenge to the current concepts of the aetiology of CVD.

The positive effect of low-carbohydrate diets on CVD risk factors (obesity, blood lipids, blood glucose, insulin, blood pressure) is already apparent in short-term clinical trials lasting 3–36 months (58) and low-carbohydrate diets also appear superior to low-fat diets in this regard (36, 37). However, these findings are still not reflected by official dietary recommendations that continue to perpetuate the unproven connection between saturated fat and CVDs (25). Understandably, because of the chronic nature of CVDs, the evidence for the connection between carbohydrates and CVD events/mortality comes mainly from longitudinal observational studies and there is a lack of long-term clinical trials that would provide definitive proof of such a connection. Therefore, our data based on long-term statistics of food consumption can be important for the direction of future research.

In fact, our ecological comparison of cancer incidence in 39 European countries (for 2012; (59)) can bring another important argument. Current rates of cancer incidence in Europe are namely the exact geographical opposite of CVDs (see Fig. 28). In sharp contrast to CVDs, cancer correlates with the consumption of animal food (particularly animal fat), alcohol, a high dietary protein quality, high cholesterol levels, high health expenditure, and above average height. These contrasting patterns mirror physiological mechanisms underlying physical growth and the development of cancer and CVDs (60). The best example of this health paradox is again that of French men, who have the lowest rates of CVD mortality in Europe, but the highest rates of cancer incidence. In other words, cancer and CVDs appear to express two extremes of a fundamental metabolic disbalance that is related to factors such as cholesterol and IGF-1 (insulin-like growth factor).

Besides total fat and protein consumption, the most likely preventive factors emerging in our study include fruits (particularly citrus fruits), wine, high-fat dairy products (especially cheese), sources of plant fat (tree nuts, olives), and potentially even vegetables and other low-glycaemic plant sources, provided that they substitute high-glycaemic foods. Many of these foodstuffs are the traditional components of the ‘Mediterranean diet’, which again strengthens the meaningfulness of our results. The factor analysis (Factor 3) also highlighted coffee, soybean oil and fish & seafood, but except for the fish & seafood, the rationale of this finding is less clear, because coffee is strongly associated with fruit consumption and soybean oil is used for various culinary purposes. Still, some support for the preventive role of coffee does exist (61) and hence, this observation should not be disregarded.

Similar to the “Mediterranean diet”, the Dietary Approaches to Stop Hypertension (DASH) diet, which is based mainly on fruits, vegetables, and low-fat dairy, also proved to be quite effective (62). However, our data indicate that the consumption of low-fat dairy may not be an optimal strategy. Considering the unreliability of observational studies highlighting low-fat dairy and the existence of strong bias regarding the intake of saturated fat, the health effect of various dairy products should be carefully tested in controlled clinical studies. In any case, our findings indicate that citrus fruits, high-fat dairy (such as cheese) and tree nuts (walnuts) constitute the most promising components of a prevention diet.

Among other potential triggers of CVDs, we should especially stress distilled beverages, which consistently correlate with CVD risk, in the absence of any relationship with health expenditure. The possible role of sunflower oil and onions is much less clear. Although sunflower oil consistently correlates with stroke mortality in the historical comparison and creates very productive regression models with some correlates of the actual CVD mortality, it is possible that both these food items mirror an environment that is deficient in some important factors correlating negatively with CVD risk.

A very important case is that of cereals because whole grain cereals are often propagated as CVD prevention. It is true that whole grain cereals are usually characterised by lower GI and FII values than refined cereals, and their benefits have been documented in numerous observational studies (63), but their consumption is also tied with a healthy lifestyle. All the available clinical trials have been of short duration and have produced inconsistent results indicating that the possible benefits are related to the substitution of refined cereals for whole grain cereals, and not because of whole grain cereals per se (64, 65). Our study cannot differentiate between refined and unrefined cereals, but both are highly concentrated sources of carbohydrates (~70–75% weight, ~80–90% energy) and cereals also make up ~50% of CA energy intake in general. To use an analogy with smoking, a switch from unfiltered to filtered cigarettes can reduce health risks, but this fact does not mean that filtered cigarettes should be propagated as part of a healthy lifestyle. In fact, even some unrefined cereals [such as the ‘whole-meal bread’ tested by Bao et al. (32)] have high glycaemic and insulin indices, and the values are often unpredictable. Therefore, in the light of the growing evidence pointing to the negative role of carbohydrates, and considering the lack of any association between saturated fat and CVDs, we are convinced that the current recommendations regarding diet and CVDs should be seriously reconsidered.

Are ‘vegetarians’ or ‘carnivores’ healthier?

“Animal protein was inversely associated with all-cause and cardiovascular mortality in older adults.”
~Tomas Merono et al, Animal protein intake is inversely associated with mortality in older adults: the InCHIANTI study

“Partial replacement of animal protein foods with plant protein foods led to marked decreases in the intake and status of vitamin B-12 and iodine.”
~Tiina Pellinen et al, Replacing dietary animal-source proteins with plant-source proteins changes dietary intake and status of vitamins and minerals in healthy adults: a 12-week randomized trial

The field of nutrition studies has been plagued with problems. Most of the research in the past was extremely low quality. Few other fields would allow such weak research to be published in peer-reviewed journals. Yet for generations, epidemiological (observational and correlational) studies were the norm in nutrition studies. This kind of research is fine for preliminary exploration in formulating new hypotheses to test, but it is entirely useless for proving or disproving any given hypothesis. Shockingly, almost all medical advice and government recommendations on diet and nutrition are based on this superficial and misleading level of results.

The main problem is there has been little, if any, control of confounding factors. Also, the comparisons used were pathetically weak. It turns out that, in studies, almost any dietary protocol or change improves health compared to a standard American diet (SAD) or other varieties of standard industrialized diets based on processed foods of refined carbs (particularly wheat), added sugar (particularly high fructose corn syrup), omega-6 seed oils (inflammatory, oxidative, and mutagenic), food additives (from glutamate to propionate), and nutrient-deficient, chemical-drenched agricultural crops (glyphosate among the worst). Assuming the dog got decent food, even eating dog shit would be better for your health than SAD.

Stating that veganism or the Mediterranean diet is healthier than what most people eat (SAD: standard American diet) really tells us nothing at all. That is even more true when the healthy user effect is not controlled for, as typically is the case with most studies. When comparing people on these diets to typical meat eaters, the ‘carnivores’ also are eating tons of plant-based carbs, sugar, and seed oils with their meat (buns, french fries, pop, etc; and, for cooking and in sauces, seed oils; not to mention snacking all day on chips, crackers, cookies, and candy). The average meat-eater consumes far more non-animal foods than animal foods, and most processed junk food is made mostly or entirely with vegan ingredients. So why do the animal foods get all the blame? And why does saturated fat get blamed when, starting back in the 1930s, seed oils replaced animal fats as the main source of fatty acids?

If scientists in this field were genuinely curious, intellectually humble, not ideologically blinded, and unbiased by big food and big farm funding, they would make honest and fair comparisons to a wide variety of optimally-designed diets. Nutritionists have known about low-carb, keto, and carnivore diets for about a century. The desire to research these diets, however, has been slim to none. The first-ever study of the carnivore diet, including a fully meat-based version, is happening right now. To give some credit, research has slowly been improving. I came across a 2013 study that compared four diets: “vegetarian, carnivorous diet rich in fruits and vegetables, carnivorous diet less rich in meat, and carnivorous diet rich in meat” (Nathalie T. Burkert et al, Nutrition and Health – The Association between Eating Behavior and Various Health Parameters: A Matched Sample Study).

It’s still kind of amusing that the researchers labeled as carnivorous a “diet rich in fruits and vegetables” and a “diet less rich in meat.” If people are mostly eating plant foods or otherwise not eating much meat, how exactly is that carnivorous in any meaningful and practical sense? Only one of the four diets was carnivorous in the sense the average person would understand it, as a diet largely based on animal foods. Even then, the study doesn’t include a carnivorous diet entirely based on animal foods. Those carnivores eating a “diet rich in meat” might still be eating plenty of processed junk food, their meat might still be cooked or slathered in harmful seed oils and come with a bun, and they might still be washing it down with sugary drinks. A McDonald’s Big Mac meal could count as part of a diet rich in meat just because meat represents the greatest portion of weight and calories. Even if their diet were only 5-10% unhealthy plant foods, it could still be doing severe damage to their health. One can fit a fairly large amount of carbs, seed oils, etc. into a relatively small portion of the diet.

I’m reminded of research that defines a “low-carb diet” as any carb intake of 40% or below, even though other studies show that 40% is at the absolute high end of carb intake for most hunter-gatherers (discussed here with links to references). As high and low are relative concepts in defining carb intake, what counts as a meat-rich diet is relative as well. I doubt these studied carnivorous “diets rich in meat” include as high a proportion of animal foods as found in the diets of the Inuit, Masai, early Americans, and Paleolithic humans. So what is actually being compared and tested? It’s not clear. This was further confounded by vegans, vegetarians, and pescetarians (fish-eaters) being combined into a single group mislabeled as ‘vegetarian’. Vegetarians and pescetarians technically could eat a diet that is primarily animal-based if they so chose (dairy, eggs, and/or fish), and I know plenty of vegetarians who eat more cheese than they do fruits and vegetables. Nonetheless, at least these researchers were making a better comparison than most studies. They did try to control for other confounders, such as pairing each person on a plant-based diet with “a subject of the same sex, age, and SES [socioeconomic status]” from each of the other three diets.

What were the results? Vegetarians, compared to the most meat-based of the diets, had worse outcomes for numerous health conditions: asthma, allergies, diabetes, cataracts, tinnitus, cardiac infarction, bronchitis, sacrospinal complaints, osteoporosis, gastric or intestinal ulcer, cancer, migraine, mental illness (anxiety disorder or depression), and “other chronic conditions.” There were only a few health conditions where the plant-based dieters fared better. For example, the so-called ‘vegetarians’ had lower rates of hypertension compared to carnivores rich in meat and less rich in meat, although higher rates than those carnivores rich in fruits and vegetables (i.e., more typical omnivores).

This is interesting evidence about the diets, though. If the carnivorous diets were low enough in starchy and sugary plant foods and low enough in dairy, they would be ketogenic, which in studies is known to lower blood pressure, and so they would show a lower rate of hypertension. That they didn’t indicates that none of these diets were low-carb, much less very low-carb (ketogenic). The plant-based dieters in this study also had lower rates of stroke and arthritis, other benefits typically seen on a ketogenic diet, which further demonstrates that this study wasn’t comparing high-carb versus low-carb as one might expect from how the diets were described in the paper. That is to say, the researchers didn’t include a category for a ketogenic carnivore diet or even a ketogenic omnivore diet, much less a ketogenic ‘vegetarian’ diet as a control. Keep in mind that keto-carnivore is one of the most common forms of intentionally following a carnivore diet, and that plant-based keto is probably more popular right now than keto-carnivore. So, the point is that these unexpected results are examples of the complications with confounding factors.

The only other result that showed an advantage to the ‘vegetarians’ was less urinary incontinence, that is, less involuntary loss of bladder control. I haven’t a clue what that might mean. If we were talking about low-carb and keto, I’d suspect that increased urination on the ‘carnivorous’ diets was related to decreased water retention (i.e., bloating) and hence the water loss that happens as metabolism shifts toward fat-burning. But since we are confident that such a diet wasn’t included in the study, these results remain anomalous. Of all the things that meat gets blamed for, I’ve never heard anyone suggest that it causes bladder problems in most people. That is odd. Anyway, it’s not exactly a life-threatening condition, even if it were caused by carnivory. It might have something to do with higher fat combined with higher carbs, in the way that this combination also contributes to obesity, whereas high-fat/low-carb and low-fat/high-carb do not predispose one to fat gain. The ‘vegetarianism’ in this study was being conflated with a low-fat diet, but all four categories apparently involved varying degrees of higher carb intake.

The basic conclusion is that ‘vegetarians’, including vegans and pescetarians, have on average poorer health across the board, with a few possible exceptions. In particular, they suffer more from chronic diseases and report higher impairment from health disorders. Also, not only these ‘vegetarians’ but also meat-eaters who ate a largely plant-based diet (“rich in fruits and vegetables”) consult doctors more often, even as ‘vegetarians’ are inconsistent about preventative healthcare such as check-ups and vaccinations. Furthermore, “subjects with a lower animal fat intake demonstrate worse health care practices,” whatever that exactly means. Generally, ‘vegetarians’ “have a lower quality of life.”

These are interesting results since the researchers were controlling for such things as wealth and poverty, and so it wasn’t an issue of access to healthcare or the quality of one’s environment or level of education. The weakness is that no data was gathered on macronutrient ratios of the subjects’ diets, and no testing was done on micronutrient content in the food and potential deficiencies in the individuals. Based on these results, no conclusions can be made about causal direction and mechanisms, but it does agree with some other research that finds similar results, including with other health conditions such as vegans and vegetarians having greater infertility. Any single one of these results, especially something like infertility, points toward serious health concerns involving deeper systemic disease and disorder within the body.

But what really stands out is the high rate of mental illness among ‘vegetarians’ (about 10%), twice as high as among the average meat-eater (about 5%), which is to say the average Westerner, and that is against the background of the Western world having experienced a drastic rise in mental illness over the past couple of centuries. And the only mental illnesses considered in this study were depression and anxiety. The percentage would be so much higher if it included all other psychiatric conditions and neurocognitive disorders (personality disorders, psychosis, psychopathy, Alzheimer’s, ADHD, autism, learning disabilities, etc.). Think about that: a large number of people on a plant-based diet are struggling at the most basic level of functioning, something I personally understand from decades of chronic depression on the SAD diet. Would you willingly choose a diet that carried a high probability of mental health struggles and suffering, of neurocognitive issues and decline?

To put this study in context, listen to what Dr. Paul Saladino, trained in psychiatry and internal medicine, has to say in the following video. Jump to around the 19-minute mark, where he goes into the nutritional angle of a carnivore diet. And by carnivore he means fully carnivore, which, if dairy is restricted as it is in his own eating, would also mean ketogenic. A keto-carnivore diet has never been studied. Hopefully, that will change soon. Until then, we have brilliant minds like Dr. Saladino’s to dig into the best evidence that is presently available.

Here are a couple of articles from the BBC. That a mainstream news source is covering this demonstrates how this knowledge is finally getting acknowledged in conventional healthcare and public debate. That is heartening.

[Text below is from linked articles.]

Why vegan junk food may be even worse for your health
by William Clark, BBC

There’s also the concern that the health risks associated with these kinds of nutrient deficiencies might not show up immediately. It could take years to associate foggy thoughts and tiredness with low B12 levels, infertility with low iron, and osteoporosis brought on by calcium deficiency does not show up until late 40s and 50s in most people, says Rossi.

“People will think about their health now and not their future health,” she says.

How a vegan diet could affect your intelligence
by Zaria Gorvett, BBC

In fact, there are several important brain nutrients that simply do not exist in plants or fungi. Creatine, carnosine, taurine, EPA and DHA omega-3 (the third kind can be found in plants), haem iron and vitamins B12 and D3 generally only occur naturally in foods derived from animal products, though they can be synthesised in the lab or extracted from non-animal sources such as algae, bacteria or lichen, and added to supplements.

Others are found in vegan foods, but only in meagre amounts; to get the minimum amount of vitamin B6 required each day (1.3 mg) from one of the richest plant sources, potatoes, you’d have to eat about five cups’ worth (equivalent to roughly 750g or 1.6lb). Delicious, but not particularly practical. […]

There are small amounts of choline in lots of vegan staples, but among the richest sources are eggs, beef and seafood. In fact, even with a normal diet, 90% of Americans don’t consume enough. According to unpublished research by Wallace, vegetarians have the lowest intakes of any demographic. “They have extremely low levels of choline, to the point where it might be concerning,” he says.

For vegans, the picture is likely to be bleaker still, since people who eat eggs tend to have almost double the choline levels of those who don’t. And though the US authorities have set suggested intakes, they might be way off.

Meat and mental health: a systematic review of meat abstention and depression, anxiety, and related phenomena
by Urska Dobersek et al

Conclusion: Studies examining the relation between the consumption or avoidance of meat and psychological health varied substantially in methodologic rigor, validity of interpretation, and confidence in results. The majority of studies, and especially the higher quality studies, showed that those who avoided meat consumption had significantly higher rates or risk of depression, anxiety, and/or self-harm behaviors. There was mixed evidence for temporal relations, but study designs and a lack of rigor precluded inferences of causal relations. Our study does not support meat avoidance as a strategy to benefit psychological health.

Native Americans Feasted Some But Mostly Fasted

“There are to be found, among us, a few strong men and women — the remnant of a by-gone generation, much healthier than our own — who can eat at random, as the savages do, and yet last on, as here and there a savage does, to very advanced years. But these random-shot eaters are, at most, but exceptions to the general rule, which requires regularity.”
~William Andrus Alcott, 1859

Three Squares: The Invention of the American Meal
by Abigail Carroll, pp. 12-14

Encountering the tribal peoples of North America, European explorers and settlers found themselves forced to question an institution they had long taken for granted: the meal. “[They] have no such thing as set meals breakfast, dinner or supper,” remarked explorer John Smith. Instead of eating at three distinct times every day, natives ate when their stomachs cued them, and instead of consuming carefully apportioned servings, they gleaned a little from the pot here and there. English colonists deplored this unstructured approach. They believed in eating according to rules and patterns—standards that separated them from the animal world. But when it came to structure, colonists were hardly in a position to boast. Though they believed in ordered eating, their meals were rather rough around the edges, lacking the kind of organization and form that typifies the modern meal today. Hardly well defined or clean-cut, colonial eating occasions were messy in more ways than one. Perhaps this partially explains why explorers and colonists were so quick to criticize native eating habits—in doing so, they hid the inconsistencies in their own. 3

Colonists found Native American eating habits wanting because they judged them by the European standard. For Europeans, a meal combined contrasting components—usually cereals, vegetables, and animal protein. Heat offered an additional desirable contrast. Swedish traveler Peter Kalm noted that many “meals” consumed by the natives of the mid-Atlantic, where he traveled in the mid-eighteenth century, consisted simply of “[maple] sugar and bread.” With only two ingredients and a distinct lack of protein, not to mention heat, this simplistic combination fell short of European criteria; it was more of a snack. Other typical nonmeals included traveling foods such as nocake (pulverized parched cornmeal to which natives added water on the go) and pemmican (a dense concoction of lean meat, fat, and sometimes dried berries). Hunters, warriors, and migrants relied on these foods, designed to be eaten in that particularly un-meal-like way in which John Williams ate his frozen meat on his journey to Québec: as the stomach required it and on the go. 4

Jerked venison and fat, chewed as one traversed the wilderness, was not most colonists’ idea of a proper meal, and if natives’ lack of sufficient contrasting components and the absence of a formal eating schedule puzzled colonists, even more mystifying was natives’ habit of going without meals, and often without any food at all, for extended periods. Jesuit missionary Christian LeClercq portrayed the Micmac of the Gaspé Peninsula in Canada as a slothful people, preserving and storing only a token winter’s supply: “They are convinced that fifteen to twenty lumps of meat, or of fish dried or cured in the smoke, are more than enough to support them for the space of five to six months.” LeClercq and many others did not realize that if natives went hungry, they did so not from neglect but by choice. Fasting was a subsistence strategy, and Native Americans were proud of it. 5

Throughout the year, Native Americans prepared for times of dearth by honing their fasting skills. They practiced hunger as a kind of athletic exercise, conditioning their bodies for the hardships of hunting, war, and seasonal shortages. According to artist George Catlin, the Mandan males in what are now the Dakotas “studiously avoided . . . every kind of excess.” An anthropologist among the Iroquois observed that they were “not great eaters” and “seldom gorged themselves.” To discourage gluttony, they even threatened their children with a visit from Sago’dakwus, a mythical monster that would humiliate them if it caught them in the act of overeating. 6

Native and European approaches to eating came to a head in the vice of gluttony. Many tribal peoples condemned overeating as a spiritual offense and a practice sure to weaken manly resolve and corrupt good character. Europeans also condemned it, largely for religious reasons, but more fundamentally because it represented a loss of control over the animal instincts. In the European worldview, overindulgence was precisely the opposite of civility, and the institution of the meal guarded against gluttony and a slippery descent into savagery. The meal gave order to and set boundaries around the act of eating, boundaries that Europeans felt native practices lacked. As explorers and colonists defended the tradition of the meal, the institution took on new meaning. For them, it became a subject of pride, serving as an emblem of civilization and a badge of European identity. 7

Europeans viewed Native Americans largely as gluttons. Because whites caught only fleeting glimpses of the complex and continually shifting lives of Native Americans, they were liable to portray the native way of life according to a single cultural snapshot, which, when it came to food, was the posthunt feast. It was well known that natives ate much and frequently during times of abundance. John Smith recorded that when natives returned from the hunt with large quantities of bear, venison, and oil, they would “make way with their provision as quick as possible.” For a short time, he explained, “they have plenty and do not spare eating.” White witnesses popularized the image of just such moments of plenty as typical. 8

Although Native Americans were hardly gluttons, Europeans, fascinated by the idea of a primitive people with a childlike lack of restraint, embraced the grossly inaccurate stereotype of the overeating Indian. William Wood portrayed the natives of southern New England as gorging themselves “till their bellies stand forth, ready to split with fullness.” A decidedly strange Anglo-American amusement involved watching Native Americans relish a meal. “Why,” asked George Catlin, “[is it] that hundreds of white folks will flock and crowd round a table to see an Indian eat?” With a hint of disappointment, William Wood recorded the appetites of tribes people invited to an English house to dine as “very moderate.” Wood was uncertain whether to interpret this reserve as politeness or timidity, but clearly he and his fellow English spectators had not expected shy and tempered eaters. 9

One culture’s perception of another often says more about the perceiver than the perceived. Although settlers lambasted natives for gluttony, whites may have been the real gluttons. According to more than one observer, many a native blushed at Europeans’ bottomless stomachs. “The large appetites of white men who visited them were often a matter of surprise to the Indians who entertained them,” wrote a nineteenth-century folklorist among the Iroquois. Early anthropologist Lewis Morgan concluded that natives required only about one-fifth of what white men consumed, and he was skeptical of his own ability to survive on such a paucity of provisions. 10

Through their criticisms, exaggerations, and stereotypes, colonists distanced themselves from a population whose ways appeared savage and unenlightened, and the organized meal provided a touchstone in this clash of cultures. It became a yardstick by which Europeans measured culture and a weapon by which they defended their definition of it. They had long known what a meal was, but now, by contrast, they knew firsthand what it was not. Encountering the perceived meal-less-ness of the natives brought the colonists’ esteemed tradition into question and gave them an opportunity to confirm their commitment to their conventions. They refused to approve of, let alone adapt to, the loose foodways of Native Americans and instead embraced all the more heartily a structured, meal-centered European approach to eating.

Old Debates Forgotten

Since earlier last year, I’ve done extensive reading, largely but not entirely focused on health. This has particularly concerned diet and nutrition, although it has crossed over into the territory of mental health with neurocognitive issues, addiction, autism, and much else, with my personal concern being that of depression. The point of this post is to consider some of the historical background. Before I get to that, let me explain how my recent interests have developed.

What got me heading in this direction was the documentary The Magic Pill. It’s about the paleo diet. The practical advice was worth the time spent, though other things drew me into the larger arena of the low-carb debate. The thing about the paleo diet is that it offers a framework of understanding that draws on many scientific fields involving health beyond diet alone, and it also explores historical records, anthropological research, and archaeological evidence. The paleo diet community in particular, along with the low-carb diet community in general, is also influenced by the traditional foods approach of Sally Fallon Morrell. She is the lady who, more than anyone else, popularized the work of Weston A. Price, an early 20th-century dentist who traveled the world and studied traditional populations. I was already familiar with this area from having read Morrell’s first book in the late ’90s or early aughts.

New to me were the writings of Gary Taubes and Nina Teicholz, two science journalists who have helped to shift the paradigm in nutritional studies. They accomplished this not only by presenting detailed surveys of the research and other evidence but also by contextualizing the history of the powerful figures, institutions, and organizations that shaped the modern industrial diet. I didn’t realize how far back this debate went: writings on fasting for epilepsy are found in ancient texts, recommendations of a low-carb (apparently ketogenic) diet for diabetes appeared in the 1790s, various low-carb and animal-based diets were popularized for weight loss and general health during the 19th century, and then the ketogenic diet was studied for epilepsy beginning in the 1920s. Yet few know this history.

Ancel Keys was one of those powerful figures who, in suppressing his critics and silencing debate, effectively advocated for the standard American diet of high carbs, grains, fruits, vegetables, and industrial seed oils. In The Magic Pill, more recent context is given by following the South African trial of Tim Noakes. Other documentaries have covered this kind of material, often with interviews with Gary Taubes and Nina Teicholz. There has been immense drama involved and, in the past, there was also much public disagreement and discussion. Only now is that returning to mainstream awareness in the corporate media, largely because social media has forced it out into the open. But what interests me is how old the debate is and how much more lively it often was in the past.

The post-revolutionary era created a sense of crisis that, by the mid-19th century, was becoming a moral panic. The culture wars were taking shape. The difference back then was that there was much more of a sense of the connection between physical health, mental health, moral health, and societal health. As a broad understanding, health was seen as key, informed by the developing scientific consciousness and the free speech movement. The hunger for knowledge was hard to suppress, although there were many attempts as the century went on. I tried to give a sense of this period in two massive posts, The Crisis of Identity and The Agricultural Mind. It’s hard to imagine what that must’ve been like. That scientific and public debate was largely shut down around the era of the World Wars, as the oppressive Cold War era took over. Why?

It is strange. The work of Taubes and Teicholz hints at what changed, although the original debate was much wider than diet and nutrition. The info I’ve found about the past has largely come from scholarship in other fields, such as historical and literary studies. Those older lines of thought are mostly treated as historical curiosities at this point, background info for the analysis of entirely different subjects. As for the majority of scientists, doctors, and nutritionists these days, they are almost entirely ignorant of the ideologies that shaped modern thought about disease and health.

This is seen, as I point out, in how Galen’s ancient Greek theory of humors as incorporated into Medieval Christianity appears to be the direct source of the basic arguments for a plant-based diet, specifically in terms of the scapegoating of red meat, saturated fat and cholesterol. Among what I’ve come across, the one scholarly book that covers this in detail is Food and Faith in Christian Culture edited by Ken Albala and Trudy Eden. Bringing that into present times, Belinda Fettke dug up how so much of contemporary nutritional studies and dietary advice was built on the foundation of 19th-20th century vegan advocacy by the Seventh Day Adventists. I’ve never met anyone adhering to “plant-based” ideology who knows this history. Yet now it is becoming common knowledge in the low-carb world.

On the literary end of things, there is a fascinating work by Bryan Kozlowski, The Jane Austen Diet. I enjoyed reading it, in spite of never having cracked open a book by Jane Austen. Kozlowski, although no scholar, was able to dredge up much of interest about those post-revolutionary decades in British society. For one, he shows how obesity was becoming noticeable all the way back then and how many people were aware of the benefits of low-carb diets. He also makes clear that the ability to maintain a vegetable garden was a sign of immense wealth, not a means of putting much food on the tables of the poor; this is corroborated by Teicholz’s discussion of how gardening in American society, prior to modern technology and chemicals, was difficult and not dependable. More importantly, Kozlowski’s book explains what ‘sensibility’ meant back then, related to ‘nerves’ and ‘vapors’ and later given the more scientific-sounding label of ‘neurasthenia’.

I came across another literary example of historical exegesis about health and diet, Sander L. Gilman’s Franz Kafka, the Jewish Patient. Kafka was an interesting case, a lifelong hypochondriac who, it turns out, had good reason to be. He felt that he had inherited a weak constitution and blamed his troubles on it, but more likely causes were urbanization, industrialization, and a vegetarian diet that probably also was a high-carb diet based on nutrient-depleted processed foods, all of this before the time when industrial foods were fortified and nutritional supplements were widely available.

What was most educational, though, about the text was Gilman’s historical details on tuberculosis in European thought, specifically in relationship to Jews. To some extent, Kafka had internalized racial ideology and that is unsurprising. Eugenics was in the air and racial ideology penetrated everything, especially health in terms of racial hygiene. Even for those who weren’t eugenicists, all debate of that era was marked by the expected biases and limitations. Some theorizing was better than others and for certain not all of it was racist, but the entire debate maybe was tainted by the events that would follow. With the defeat of the Nazis, eugenics fell out of favor for obvious reasons and an entire era of debate was silenced, even many of the arguments that were opposed to or separate from eugenics. Then historical amnesia set in, as many people wanted to forget the past and instead focus on the future. That was unfortunate. The past doesn’t simply disappear but continues to haunt us.

That earlier debate was a struggle between explanations and narratives. With modernity fully taking hold, people wanted to understand what was happening to humanity and where it was heading. It was a time of contrasts, which made the consequences of modernity quite stark. There were plenty of communities that were still pre-industrial, rural, and traditional, but since then most of these communities have died away. The diseases of civilization, at this point, have become increasingly normalized as living memory of anything else has disappeared. It’s not that the desire for ideological explanations has disappeared. What happened was, with the Allied victory in World War II and the ensuing propaganda of the Cold War, a particular grand narrative came to dominate the entire Western world and there simply were no other grand narratives to compete with it. Much of the pre-war debate and even scientific knowledge, especially in Europe, was forgotten as the records of it were destroyed, weren’t translated, or lost perceived relevance.

Nonetheless, all of those old ideological conflicts were left unresolved. The concerns then are still concerns now. So many problems worried about back then are getting worse. The connections between various aspects of health have regained their old sense of urgency. The public is once again challenging authorities, questioning received truths, and seeking new meaning. The debate never ended and here we are again, and one could add that fascism also is back rearing its ugly head. It’s worrisome that the political left seems to be slow on the uptake. There are reactionary right-wingers like Jordan Peterson who are offering visions of meaning and also who have become significant figures in the dietary world, by way of the carnivore diet he and his daughter are on. Then there are the conspiratorial paleo-libertarians such as Tristan Haggard, another carnivore advocate.

This is far from being limited to carnivory, and the low-carb community includes people across the political spectrum, but it seems to be the right-wingers who are speaking the loudest. The left-wingers who are speaking out on diet come from the confluence of veganism/vegetarianism and environmentalism, as seen with EAT-Lancet (Dietary Dictocrats of EAT-Lancet). The problem with this, besides much of the narrative being false (Carnivore is Vegan), is that it is disconnected from the past. Even if with immense distortion, the right wing is speaking more to the past than the left wing is, as in Trump’s ability to invoke and combine Populist and Progressive rhetoric from earlier last century. The political left is struggling to keep up and is being led down ideological dead ends.

If we want to understand our situation now, we better study carefully what was happening in centuries past. We keep having the same old debates without realizing it and we very well might see them lead to the same kinds of unhappy results with authoritarianism and totalitarianism, maybe even once again eugenics, genocide, and world war or some similar horrors of mass atrocities and crimes against humanity. One would like to believe, though, that such is not an inevitable fate. There doesn’t appear to be anything stopping us from choosing otherwise. We always could seek to have different debates or, at the very least, to put past debates into new context based on emerging scientific knowledge and understandings.

Moral Panic and Physical Degeneration

From the beginning of the country, there has been an American fear of moral and mental decline that was always rooted in the physical, involving issues of vitality of land and health of the body, and built on an ancient divide between the urban and rural. Over time, it grew into a fever pitch of moral panic about degeneration and degradation of the WASP culture, the white race, and maybe civilization itself. Some saw the end was near, maybe being able to hold out for another few generations before finally succumbing to disease and weakness. The need for revitalization and rebirth became a collective project (Jackson Lears, Rebirth of a Nation), which sadly fed into ethno-nationalist bigotry and imperialistic war-mongering — Make America Great Again!

A major point of crisis, of course, was the Civil War. Racial ideology became predominant, not only because of slavery but maybe more so because of mass immigration, the latter being the main reason the North won. Racial tensions merged with the developing scientific mindset of Darwinism, and out of this mix came eugenics. For all that we can now dismiss this kind of simplistic ignorance and with hindsight see the danger it led to, the underlying anxieties were real. Urbanization and industrialization were having an obvious impact on public health that was observed by many, and it wasn’t limited to mere physical ailments. “Cancer, like insanity, seems to increase with the progress of civilization,” noted Stanislas Tanchou, a mid-19th century French physician.

The diseases of civilization, including mental sickness, have been spreading for centuries (millennia, actually, considering that the ‘modern’ chronic health conditions were first detected in the mummies of the agricultural Egyptians). Consider how talk of depression suddenly showed up in written accounts with the ending of feudalism (Barbara Ehrenreich, Dancing in the Street). That era included the enclosure movement that forced millions of then landless serfs into the desperate conditions of crowded cities and colonies where they faced stress, hunger, malnutrition, and disease. The loss of rural life hit Europe much earlier than America, but it eventually came here as well. The majority of white Americans were urban by the beginning of the 20th century and the majority of black Americans were urban by the 1970s. There has been a consistent pattern of mass problems following urbanization, everywhere it happens. It still is happening. The younger generation, more urbanized than any generation before, is seeing rising rates of psychosis specifically concentrated in the most urbanized areas.

In the United States, it was the last decades of the 19th century that were the turning point, the period of the first truly big cities. Into this milieu, Weston A. Price was born (1870) in a small rural village in Canada. As an adult, he became a dentist and sought work in Cleveland, Ohio (1893). Initially, most of his patients probably had, like him, grown up in rural areas. But over the decades, he increasingly was exposed to younger generations who had spent their entire lives in the city. Lierre Keith puts Price’s early observations in context, after pointing out that he started his career in 1893: “This date is important, as he entered the field just prior to the glut of industrial food. Over the course of the next thirty years, he watched children’s dentition — and indeed their overall health deteriorate. There was suddenly children whose teeth didn’t fit in their mouths, children with foreshortened jaws, children with lots of cavities. Not only were their dental arches too small, but he noticed their nasal passages were also too narrow, and they had poor health overall; asthma, allergies, behavioral problems” (The Vegetarian Myth, p. 187). This was at the time when the industrialization of farming and food had reached a new level, far beyond the limited availability of canned foods in the mid-to-late 1800s, when most Americans still relied on a heavy amount of wild-sourced meat, fish, nuts, etc. Even city-dwellers in early America had ready access to wild game because of the abundance of surrounding wilderness areas. In fact, in the 19th century, the average American ate more meat (mostly hunted) than bread.

We are once again coming back to the ever-recurrent moral panic about the civilizational project. The same fears given voice in the late 19th to early 20th century are being repeated again. For example, Dr. Leonard Sax alerts us to how girls are sexually maturing early (1% of female infants showing signs of puberty), whereas boys are maturing later. As a comparison, hunter-gatherers don’t have such a large gender disparity in puberty, nor do girls experience puberty so early; instead, both genders typically come to puberty around 18 years old, with sex, pregnancy, and marriage happening more or less simultaneously. Dr. Sax, along with others, speculates about a number of reasons. Common causes held responsible include health factors, from diet to chemicals. Beyond altered puberty, many other examples could be added: heart disease, autoimmune disorders, mood disorders, autism, ADHD, etc.; all of them increasing and worsening with each generation (e.g., type 2 diabetes used to be known as adult-onset diabetes but now is regularly diagnosed in young children; the youngest victim recorded recently was three years old when diagnosed).

In the past, Americans responded to moral panic with genocide of Native Americans, Prohibition targeting ethnic (hyphenated) Americans and the poor, and immigrant restrictions to keep the bad sort out; the spread of racism and vigilantism such as KKK and Jim Crow and sundown towns and redlining, forced assimilation such as English only laws and public schools, and internment camps for not only Japanese-Americans but also German-Americans and Italian-Americans; implementation of citizen-making projects like national park systems, Boy Scouts, WPA, and CCC; promotion of eugenics, war on poverty (i.e., war on the poor), imperial expansionism, neo-colonial exploitation, and world wars; et cetera. The cure sought was often something to be forced onto the population by a paternalistic elite, that is to say rich white males, most specifically WASPs of the capitalist class.

Eugenics was, of course, one of the main focuses as it carried the stamp of science (or rather scientism). Yet at the same time, there were those challenging biological determinism and race realism, as views shifted toward environmental explanations. The anthropologists were at the front lines of this battle, but there were also Social Christians who changed their minds after having seen poverty firsthand. Weston A. Price, however, didn’t come to this from a consciously ideological position or religious motivation. He was simply a dentist who couldn’t ignore the severe health issues of his patients. So, he decided to travel the world in order to find healthy populations to study, in the hope of explaining why the change had occurred (Nutrition and Physical Degeneration).

Although familiar with eugenics literature, what Price observed in ‘primitive’ communities (including isolated villages in Europe) did not conform to eugenicist thought. It didn’t matter which population he looked at. Those who ate traditional diets were healthy and those who ate an industrialized Western diet were not. And it was a broad pattern that he saw everywhere he went, not only physical health but also neurocognitive health as indicated by happiness, low anxiety, and moral character. Instead of blaming individuals or races, he saw the common explanation as nutrition and he made a strong case by scientifically analyzing the nutrition of available foods.

In reading about traditional foods, the paleo diet/lifestyle, and functional medicine, Price’s work comes up quite often. He took many photographs that compared people from healthy and unhealthy populations. The contrast is stark. But what really stands out is how few people in the modern world look anywhere near as healthy as those from the healthiest societies of the past. I live in a fairly wealthy college and medical town where there is a far above-average concern for health along with access to healthcare. Even so, I now can’t help noticing how many people around me show signs of stunted or perturbed development of the exact kind Price observed in great detail: thin bone structure, sunken chests, sloping shoulders, narrow facial features, asymmetry, etc. That is even with modern healthcare correcting some of the worst conditions: cavities, underbites, pigeon-toes, etc. My fellow residents in this town are among the most privileged people in the world and, nonetheless, their health says something sad about humanity at present.

It makes me wonder, as it made Price wonder, what consequences this has on neurocognitive health for individuals and the moral health of society. Taken alone, it isn’t enough to get excited about. But put in the larger context of looming catastrophes, it does become concerning. It’s not clear that our health will be up to the task of the problems we need to solve. We are a sickly population, far more sickly than when moral panic took hold in past generations.

As important, there is the personal component. I’m at a point where I’m not going to worry too much about decline and maybe collapse of civilization. I’m kind of hoping the American Empire will meet its demise. Still, that leaves us with many who suffer, no matter what happens to society as a whole. I take that personally, as one who has struggled with physical and mental health issues. And I’ve come around to Price’s view of nutrition as being key. I see these problems in other members of my family and it saddens me to watch as health conditions seem to get worse from one generation to the next.

The central point I’m trying to make here is that this is far from a new problem. Talking to my mother, she has a clear sense of the differences on the two sides of her family. Her mother’s family came from rural areas and, even after moving to a larger city for work, they continued to hunt on a daily basis, as there were nearby fields and woods that made it possible. They were a healthy, happy, and hard-working lot. They got along well as a family. Her father’s side of the family was far different. They had been living in towns and cities for several generations by the time she was born. They didn’t hunt at all. They were known for being surly, holding grudges, and being mean drunks. They also had underbites (i.e., underdeveloped jaw structure) and seemed to have had learning disabilities, though no one was diagnosing such conditions back then. Related to this difference, my mother’s father raised rabbits whereas my mother’s mother’s family hunted rabbits (and other wild game). This makes a big difference in terms of nutrition, as wild game has higher levels of omega-3 fatty acids and fat-soluble vitamins, all of which are key to optimal health and development.

What my mother observed in her family is basically the same as what Price observed in hundreds of communities in multiple countries on every continent. And I now observe the same pattern repeating. I grew up with an underbite. My brothers and I all required orthodontic work, as do so many now. I was diagnosed with a learning disability when young. My oldest brother, though perhaps not with a learning disability, showed apparent behavioral issues when young, likely related to his mildew allergies and probably an underlying autoimmune condition. I know I had food allergies as a child, as I think my other brother did as well. Besides learning disabilities, all of us have had a fair diversity of neurocognitive and psychological issues: stuttering, depression, anxiety, and maybe some Asperger’s.

Now another generation is coming along with increasing rates of major physical and mental health issues. My nieces and nephews are sick all the time. They don’t eat well and are probably malnourished. During a medical checkup for my nephew, my mother asked the doctor about his extremely unhealthy diet, consisting mostly of white bread and sugar. The doctor bizarrely dismissed it as ‘normal,’ claiming that no kid eats healthy. If that is the new normal, maybe we should be in a moral panic.

* * *

Violent Behavior: A Solution in Plain Sight
by Sylvia Onusic

Nutrition and Mental Development
by Sally Fallon Morell

You Are What You Eat: The Research and Legacy of Dr. Weston Andrew Price
by John Larabell

While practicing in his Cleveland office, Dr. Price noticed an increase in dental problems among the younger generations. These issues included the obvious dental caries (cavities) as well as improper jaw development leading to crowded, crooked teeth. In fact, the relatively new orthodontics industry was at that time beginning to gain popularity. Perplexed by these modern problems that seemed to be affecting a greater and greater portion of the population, Dr. Price set about to research the issue by examining people who did not display such problems. He suspected (correctly, as he would later find) that many of the dental problems, as well as other degenerative health problems, that were plaguing modern society were the result of inadequate nutrition owing to the increasing use of refined, processed foods.

Nasty, Brutish and Short?
by Sally Fallon Morell

It seems as if the twentieth century will exit with a crescendo of disease. Things were not so bad back in the 1930’s, but the situation was already serious enough to cause one Cleveland, Ohio dentist to be concerned. Dr. Weston Price was reluctant to accept the conditions exhibited by his patients as normal. Rarely did an examination of an adult patient reveal anything but rampant decay, often accompanied by serious problems elsewhere in the body, such as arthritis, osteoporosis, diabetes, intestinal complaints and chronic fatigue. (They called it neurasthenia in Price’s day.) But it was the dentition of younger patients that alarmed him most. Price observed that crowded, crooked teeth were becoming more and more common, along with what he called “facial deformities”—overbites, narrowed faces, underdevelopment of the nose, lack of well-defined cheekbones and pinched nostrils. Such children invariably suffered from one or more complaints that sound all too familiar to mothers of the 1990’s: frequent infections, allergies, anemia, asthma, poor vision, lack of coordination, fatigue and behavioral problems. Price did not believe that such “physical degeneration” was God’s plan for mankind. He was rather inclined to believe that the Creator intended physical perfection for all human beings, and that children should grow up free of ailments.

Is it Mental or is it Dental?
by Raymond Silkman

The widely held model of orthodontics, which considers developmental problems in the jaws and head to be genetic in origin, never made sense to me. Since they are wedded to the genetic model, orthodontists dealing with crowded teeth end up treating the condition with tooth extraction in a majority of the cases. Even though I did not resort to pulling teeth in my practice, and I was using appliances to widen the jaws and getting the craniums to look as they should, I still could not come up with the answer as to why my patients looked the way they did. I couldn’t believe that the Creator had given them a terrible blueprint – it just did not make sense. In four years of college education, four years of dental school education and almost three years of post-graduate orthodontic training, students never hear a mention of Dr. Price, so they never learn the true reasons for these malformations. I have had the opportunity to work with a lot of very knowledgeable doctors in various fields of allopathic and alternative healthcare who still do not know about Dr. Price and his critical findings.

These knowledgeable doctors have not stared in awe at the beautiful facial development that Price captured in the photographs he took of primitive peoples throughout the globe and in so doing was able to answer this most important question: What do humans look like in health? And how have humans been able to carry on throughout history and populate such varied geographical and physical environments on the earth without our modern machines and tools?

The answer that Dr. Price was able to illuminate came through his photographs of beautiful, healthy human beings with magnificent physical form and mental development, living in harmony with their environments. […]

People who are not well oxygenated and who have poor posture often suffer from fatigue and fibromyalgia symptoms, they snore and have sleep apnea, they have sinusitis and frequent ear infections. Life becomes psychologically and physically challenging for them and they end up with long-term dependence on medications—and all of that just from the seemingly simple condition of crowded teeth.

In other words, people with poor facial development are not going to live very happily. […]

While very few people have heard of the work of Weston Price these days, we haven’t lost our ability to recognize proper facial form. To make it in today’s society, you must have good facial development. You’re not going to see a general or a president with a weak chin, you’re not going to see coaches with weak chins, you’re not going to see a lot of well-to-do personalities in the media with underdeveloped faces and chins. You don’t see athletes and newscasters with narrow palates and crooked teeth.

Weston A. Price: An Unorthodox Dentist
by Nourishing Israel

Price discovered that the native foods eaten by the isolated populations were far more nutrient dense than the modern foods. In the first generation that changed their diet there was noticeable tooth decay; in subsequent generations the dental and facial bone structure changed, along with other changes that had been seen in American and European families and previously considered to be the result of interracial marriage.

By studying the different routes that the same populations had taken – traditional versus modern diet – he saw that the health of the children is directly related to the health of the parents and the germ plasms that they provide, which are as important to the child’s makeup as the health of the mother before and during pregnancy.

Price also found that primitive populations were very conscious of the importance of the mothers’ health and many populations made sure that girls were given a special diet for several months before they were allowed to marry.

Another interesting finding was that although genetic makeup was important, it did not have as great a degree of influence on a person’s development and health as was thought; rather, a lot of individual characteristics, including brain development and brain function, were due to environmental influence, what he called “intercepted heredity”.

The origin of personality and character appear in the light of the newer data to be biologic products and to a much less degree than usually considered pure hereditary traits. Since these various factors are biologic, being directly related to both the nutrition of the parents and to the nutritional environment of the individuals in the formative and growth period, any common contributing factor such as food deficiencies due to soil depletion will be seen to produce degeneration of the masses of people due to a common cause. Mass behavior therefore, in this new light, becomes the result of natural forces, the expression of which may not be modified by propaganda but will require correction at the source. [1] …

It will be easy for the reader to be prejudiced since many of the applications suggested are not orthodox. I suggest that conclusions be deferred until the new approach has been used to survey the physical and mental status of the reader’s own family, of his brothers and sisters, of associated families, and finally, of the mass of people met in business and on the street. Almost everyone who studies the matter will be surprised that such clear-cut evidence of a decline in modern reproductive efficiency could be all about us and not have been previously noted and reviewed.[2]

From Nutrition and Physical Degeneration by Weston Price

Food Freedom – Nourishing Raw Milk
by Lisa Virtue

In 1931 Price visited the people of the Loetschental Valley in the Swiss Alps. Their diet consisted of rye bread, milk, cheese and butter, including meat once a week (Price, 25). The milk was collected from pastured cows, and was consumed raw: unpasteurized, unhomogenized (Schmid, 9).

Price described these people as having “stalwart physical development and high moral character…superior types of manhood, womanhood and childhood that Nature has been able to produce from a suitable diet and…environment” (Price, 29). At this time, tuberculosis had taken more lives in Switzerland than any other disease. The Swiss government ordered an inspection of the valley, revealing not a single case. No deaths had been recorded from tuberculosis in the history of the Loetschental people (Schmid, 8). Upon return home, Price had dairy samples from the valley sent to him throughout the year. These samples were higher in minerals and vitamins than samples from commercial (thus pasteurized) dairy products in America and the rest of Europe. The Loetschental milk was particularly high in fat-soluble vitamin D (Schmid, 9).

The daily intake of calcium and phosphorus, as well as fat-soluble vitamins, would have been higher than that of the average North American child. These children were strong and sturdy, playing barefoot in the glacial waters into the late chilly evenings. Of all the children in the valley eating primitive foods, cavities were detected at an average of 0.3 per child (Price, 25). This without visiting a dentist or physician, for the valley had none, seeing as there was no need (Price, 23). To offer some perspective, the rate of cavities per child between the ages of 6-19 in the United States has been recorded to be 3.25, over 10 times the rate seen in Loetschental (Nagel).

Price offers some perspective on a society subsisting mainly on raw dairy products: “One immediately wonders if there is not something in the life-giving vitamins and minerals of the food that builds not only great physical structures within which their souls reside, but builds minds and hearts capable of a higher type of manhood…” (Price, 26).

100 Years Before Weston Price
by Nancy Henderson

Like Price, Catlin was struck by the beauty, strength and demeanor of the Native Americans. “The several tribes of Indians inhabiting the regions of the Upper Missouri. . . are undoubtedly the finest looking, best equipped, and most beautifully costumed of any on the Continent.” Writing of the Blackfoot and Crow, tribes who hunted buffalo on the rich glaciated soils of the American plains, he declared: “They are the happiest races of Indian I have met—picturesque and handsome, almost beyond description.”

“The very use of the word savage,” wrote Catlin, “as it is applied in its general sense, I am inclined to believe is an abuse of the word, and the people to whom it is applied.” […]

As did Weston A. Price one hundred years later, Catlin noted the fact that moral and physical degeneration came together with the advent of civilized society. In his late 1830s portrait of “Pigeon’s Egg Head (The Light) Going to and Returning from Washington,” Catlin painted him corrupted with “gifts of the great white father” upon his return to his native homeland. Those gifts included two bottles of whiskey in his pockets. […]

Like Price, Catlin discusses the issue of heredity versus environment. “No diseases are natural,” he writes, “and deformities, mental and physical, are neither hereditary nor natural, but purely the result of accidents or habits.”

So wrote Dr. Price: “Neither heredity nor environment alone cause our juvenile delinquents and mental defectives. They are cripples, physically, mentally and morally, which could have and should have been prevented by adequate education and by adequate parental nutrition. Their protoplasm was not normally organized.”

The Right Price
by Weston A. Price Foundation

Many commentators have criticized Price for attributing “decline in moral character” to malnutrition. But it is important to realize that the subject of “moral character” was very much on the minds of commentators of his day. As with changes in facial structure, observers in the first half of the 20th century attributed “badness” in people to race mixing, or to genetic defects. Price quotes A.C. Jacobson, author of a 1926 publication entitled Genius (Some Revaluations),35 who stated that “The Jekyll-Hydes of our common life are ethnic hybrids.” Said Jacobson, “Aside from the effects of environment, it may safely be assumed that when two strains of blood will not mix well a kind of ‘molecular insult’ occurs which the biologists may some day be able to detect beforehand, just as blood is now tested and matched for transfusion.” The implied conclusion to this assertion is that “degenerates” can be identified through genetic testing and “weeded out” by sterilizing the unfit–something that was imposed on many women during the period and endorsed by powerful individuals, including Oliver Wendell Holmes.

It is greatly to Price’s credit that he objected to this arrogant point of view: “Most current interpretations are fatalistic and leave practically no escape from our succession of modern physical, mental and moral cripples. . . If our modern degeneration were largely the result of incompatible racial stocks as indicated by these premises, the outlook would be gloomy in the extreme.”36 Price argued that nutritional deficiencies affecting the physical structure of the body can also affect the brain and nervous system; and that while “bad” character may be the result of many influences–poverty, upbringing, displacement, etc.–good nutrition also plays a role in creating a society of cheerful, compassionate individuals.36

Rebirth of a Nation:
The Making of Modern America, 1877-1920
By Jackson Lears
pp. 7-9

By the late nineteenth century, dreams of rebirth were acquiring new meanings. Republican moralists going back to Jefferson’s time had long fretted about “overcivilization,” but the word took on sharper meaning among the middle and upper classes in the later decades of the nineteenth century. During the postwar decades, “overcivilization” became not merely a social but an individual condition, with a psychiatric diagnosis. In American Nervousness (1880), the neurologist George Miller Beard identified “neurasthenia,” or “lack of nerve force,” as the disease of the age. Neurasthenia encompassed a bewildering variety of symptoms (dyspepsia, insomnia, nocturnal emissions, tooth decay, “fear of responsibility, of open places or closed places, fear of society, fear of being alone, fear of fears, fear of contamination, fear of everything, deficient mental control, lack of decision in trifling matters, hopelessness”), but they all pointed to a single overriding effect: a paralysis of the will.

The malady identified by Beard was an extreme version of a broader cultural malaise—a growing sense that the Protestant ethic of disciplined achievement had reached the end of its tether, had become entangled in the structures of an increasingly organized capitalist society. Ralph Waldo Emerson unwittingly predicted the fin de siècle situation. “Every spirit makes its house,” he wrote in “Fate” (1851), “but afterwards the house confines the spirit.” The statement presciently summarized the history of nineteenth-century industrial capitalism, on both sides of the Atlantic.

By 1904, the German sociologist Max Weber could put Emerson’s proposition more precisely. The Protestant ethic of disciplined work for godly ends had created an “iron cage” of organizations dedicated to the mass production and distribution of worldly goods, Weber argued. The individual striver was caught in a trap of his own making. The movement from farm to factory and office, and from physical labor outdoors to sedentary work indoors, meant that more Europeans and North Americans were insulated from primary processes of making and growing. They were also caught up in subtle cultural changes—the softening of Protestantism into platitudes; the growing suspicion that familiar moral prescriptions had become mere desiccated, arbitrary social conventions. With the decline of Christianity, the German philosopher Friedrich Nietzsche wrote, “it will seem for a time as though all things had become weightless.”

Alarmists saw these tendencies as symptoms of moral degeneration. But a more common reaction was a diffuse but powerful feeling among the middle and upper classes—a sense that they had somehow lost contact with the palpitating actuality of “real life.” The phrase acquired unprecedented emotional freight during the years around the turn of the century, when reality became something to be pursued rather than simply experienced. This was another key moment in the history of longing, a swerve toward the secular. Longings for this-worldly regeneration intensified when people with Protestant habits of mind (if not Protestant beliefs) confronted a novel cultural situation: a sense that their way of life was being stifled by its own success.

On both sides of the Atlantic, the drive to recapture “real life” took myriad cultural forms. It animated popular psychotherapy and municipal reform as well as avant-garde art and literature, but its chief institutional expression was regeneration through military force. As J. A. Hobson observed in Imperialism (1902), the vicarious identification with war energized jingoism and militarism. By the early twentieth century, in many minds, war (or the fantasy of it) had become the way to keep men morally and physically fit. The rise of total war between the Civil War and World War I was rooted in longings for release from bourgeois normality into a realm of heroic struggle. This was the desperate anxiety, the yearning for rebirth, that lay behind official ideologies of romantic nationalism, imperial progress, and civilizing mission—and that led to the trenches of the Western Front.

Americans were immersed in this turmoil in peculiarly American ways. As the historian Richard Slotkin has brilliantly shown, since the early colonial era a faith in regeneration through violence underlay the mythos of the American frontier. With the closing of the frontier (announced by the U.S. census in 1890), violence turned outward, toward empire. But there was more going on than the refashioning of frontier mythology. American longings for renewal continued to be shaped by persistent evangelical traditions, and overshadowed by the shattering experience of the Civil War. American seekers merged Protestant dreams of spiritual rebirth with secular projects of purification—cleansing the body politic of secessionist treason during the war and political corruption afterward, reasserting elite power against restive farmers and workers, taming capital in the name of the public good, reviving individual and national vitality by banning the use of alcohol, granting women the right to vote, disenfranchising African-Americans, restricting the flow of immigrants, and acquiring an overseas empire.

Of course not all these goals were compatible. Advocates of various versions of rebirth—bodybuilders and Prohibitionists, Populists and Progressives, Social Christians and Imperialists—all laid claims to legitimacy. Their crusades met various ends, but overall they relieved the disease of the fin de siècle by injecting some visceral vitality into a modern culture that had seemed brittle and about to collapse. Yearning for intense experience, many seekers celebrated Force and Energy as ends in themselves. Such celebrations could reinforce militarist fantasies but could also lead in more interesting directions—toward new pathways in literature and the arts and sciences. Knowledge could be revitalized, too. William James, as well as Houdini and Roosevelt, was a symbol of the age.

The most popular forms of regeneration had a moral dimension.

pp. 27-29

But for many other observers, too many American youths—especially among the upper classes—had succumbed to the vices of commerce: the worship of Mammon, the love of ease. Since the Founding Fathers’ generation, republican ideologues had fretted about the corrupting effects of commercial life. Norton and other moralists, North and South, had imagined war would provide an antidote. During the Gilded Age those fears acquired a peculiarly palpable intensity. The specter of “overcivilization”—invoked by republican orators since Jefferson’s time—developed a sharper focus: the figure of the overcivilized businessman became a stock figure in social criticism. Flabby, ineffectual, anxious, possibly even neurasthenic, he embodied bourgeois vulnerability to the new challenges posed by restive, angry workers and waves of strange new immigrants. “Is American Stamina Declining?” asked William Blaikie, a former Harvard athlete and author of How to Get Strong and Stay So, in Harper’s in 1889. Among white-collar “brain-workers,” legions of worried observers were asking similar questions. Throughout the country, metropolitan life for the comfortable classes was becoming a staid indoor affair. Blaikie caught the larger contours of the change:

“A hundred years ago, there was more done to make our men and women hale and vigorous than there is to-day. Over eighty per cent of all our men then were farming, hunting, or fishing, rising early, out all day in the pure, bracing air, giving many muscles very active work, eating wholesome food, retiring early, and so laying in a good stock of vitality and health. But now hardly forty per cent are farmers, and nearly all the rest are at callings—mercantile, mechanical, or professional—which do almost nothing to make one sturdy and enduring.”

This was the sort of anxiety that set men (and more than a few women) to pedaling about on bicycles, lifting weights, and in general pursuing fitness with unprecedented zeal. But for most Americans, fitness was not merely a matter of physical strength. What was equally essential was character, which they defined as adherence to Protestant morality. Body and soul would be saved together.

This was not a gender-neutral project. Since the antebellum era, purveyors of conventional wisdom had assigned respectable women a certain fragility. So the emerging sense of physical vulnerability was especially novel and threatening to men. Manliness, always an issue in Victorian culture, had by the 1880s become an obsession. Older elements of moral character continued to define the manly man, but a new emphasis on physical vitality began to assert itself as well. Concern about the over-soft socialization of the young promoted the popularity of college athletics. During the 1880s, waves of muscular Christianity began to wash over campuses.

pp. 63-71

NOT MANY AMERICAN men, even among the comparatively prosperous classes, were as able as Carnegie and Rockefeller to master the tensions at the core of their culture. Success manuals acknowledged the persistent problem of indiscipline, the need to channel passion to productive ends. Often the language of advice literature was sexually charged. In The Imperial Highway (1881), Jerome Bates advised:

[K]eep cool, have your resources well in hand, and reserve your strength until the proper time arrives to exert it. There is hardly any trait of character or faculty of intellect more valuable than the power of self-possession, or presence of mind. The man who is always “going off” unexpectedly, like an old rusty firearm, who is easily fluttered and discomposed at the appearance of some unforeseen emergency; who has no control over himself or his powers, is just the one who is always in trouble and is never successful or happy.

The assumptions behind this language are fascinating and important to an understanding of middle- and upper-class Americans in the Gilded Age. Like many other purveyors of conventional wisdom—ministers, physicians, journalists, health reformers—authors of self-help books assumed a psychic economy of scarcity. For men, this broad consensus of popular psychology had sexual implications: the scarce resource in question was seminal fluid, and one had best not be diddling it away in masturbation or even nocturnal emissions. This was easier said than done, of course, as Bates indicated, since men were constantly addled by insatiable urges, always on the verge of losing self-control—the struggle to keep it was an endless battle with one’s own darker self. Spiritual, psychic, and physical health converged. What Freud called “‘civilized’ sexual morality” fed directly into the “precious bodily fluids” school of health management. The man who was always “‘going off’ unexpectedly, like an old rusty firearm,” would probably be sickly as well as unsuccessful—sallow, sunken-chested, afflicted by languorous indecision (which was how Victorian health literature depicted the typical victim of what was called “self-abuse”).

But as this profile of the chronic masturbator suggests, scarcity psychology had implications beyond familiar admonitions to sexual restraint. Sexual scarcity was part of a broader psychology of scarcity; the need to conserve semen was only the most insistently physical part of a much more capacious need to conserve psychic energy. As Bates advised, the cultivation of “self-possession” allowed you to “keep your resources well in hand, and reserve your strength until the proper time arrives to exert it.” The implication was that there was only so much strength available to meet demanding circumstances and achieve success in life. The rhetoric of “self-possession” had financial as well as sexual connotations. To preserve a cool, unruffled presence of mind (to emulate Rockefeller, in effect) was one way to stay afloat on the storm surges of the business cycle.

The object of this exercise, at least for men, was personal autonomy—the ownership of one’s self. […]

It was one thing to lament excessive wants among the working class, who were supposed to be cultivating contentment with their lot, and quite another to find the same fault among the middle class, who were supposed to be improving themselves. The critique of middle-class desire posed potentially subversive questions about the dynamic of dissatisfaction at the core of market culture, about the very possibility of sustaining a stable sense of self in a society given over to perpetual jostling for personal advantage. The ruinous results of status-striving led advocates of economic thrift to advocate psychic thrift as well.

By the 1880s, the need to conserve scarce psychic resources was a commonly voiced priority among the educated and affluent. Beard’s American Nervousness had identified “the chief and primary cause” of neurasthenia as “modern civilization,” which placed unprecedented demands on limited emotional energy. “Neurasthenia” and “nervous prostration” became catchall terms for a constellation of symptoms that today would be characterized as signs of chronic depression—anxiety, irritability, nameless fears, listlessness, loss of will. In a Protestant culture, where effective exercise of will was the key to individual selfhood, the neurasthenic was a kind of anti-self—at best a walking shadow, at worst a bedridden invalid unable to make the most trivial choices or decisions. Beard and his colleagues—neurologists, psychiatrists, and self-help writers in the popular press—all agreed that nervous prostration was the price of progress, a signal that the psychic circuitry of “brain workers” was overloaded by the demands of “modern civilization.”

While some diagnoses of this disease deployed electrical metaphors, the more common idiom was economic. Popular psychology, like popular economics, was based on assumptions of scarcity: there was only so much emotional energy (and only so much money) to go around. The most prudent strategy was the husbanding of one’s resources as a hedge against bankruptcy and breakdown. […]

Being reborn through a self-allowed regime of lassitude was idiosyncratic, though important as a limiting case. Few Americans had the leisure or the inclination to engage in this kind of Wordsworthian retreat. Most considered neurasthenia at best a temporary respite, at worst an ordeal. They strained, if ambivalently, to be back in harness.

The manic-depressive psychology of the business class mimicked the lurching ups and downs of the business cycle. In both cases, assumptions of scarcity underwrote a pervasive defensiveness, a circle-the-wagons mentality. This was the attitude that lay behind the “rest cure” devised by the psychiatrist Silas Weir Mitchell, who proposed to “fatten” and “redden” the (usually female) patient by isolating her from all mental and social stimulation. (This nearly drove the writer Charlotte Perkins Gilman crazy, and inspired her story “The Yellow Wallpaper.”) It was also the attitude that lay behind the fiscal conservatism of the “sound-money men” on Wall Street and in Washington—the bankers and bondholders who wanted to restrict the money supply by tying it to the gold standard. Among the middle and upper classes, psyche and economy alike were haunted by the common specter of scarcity. But there were many Americans for whom scarcity was a more palpable threat.

AT THE BOTTOM of the heap were the urban poor. To middle-class observers they seemed little more than a squalid mass jammed into tenements that were festering hives of “relapsing fever,” a strange malady that left its survivors depleted of strength and unable to work. The disease was “the most efficient recruiting officer pauperism ever had,” said a journalist investigating tenement life in the 1870s. Studies of “the nether side of New York” had been appearing for decades, but—in the young United States at least—never before the Gilded Age had the story of Dives and Lazarus been so dramatically played out, never before had wealth been so flagrant, or poverty been so widespread and so unavoidably appalling. The army of thin young “sewing-girls” trooping off in the icy dawn to sweatshops all over Manhattan, the legions of skilled mechanics forced by high New York rents to huddle with their families amid a crowd of lowlifes, left without even a pretense of privacy in noisome tenements that made a mockery of the Victorian cult of home—these populations began to weigh on the bourgeois imagination, creating concrete images of the worthy, working poor.

pp. 99-110

Racial animosities flared in an atmosphere of multicultural fluidity, economic scarcity, and sexual rivalry. Attitudes arising from visceral hostility acquired a veneer of scientific objectivity. Race theory was nothing new, but in the late nineteenth century it mutated into multiple forms, many of them characterized by manic urgency, sexual hysteria, and biological determinism. Taxonomists had been trying to arrange various peoples in accordance with skull shape and brain size for decades; popularized notions of natural selection accelerated the taxonomic project, investing it more deeply in anatomical details. The superiority of the Anglo-Saxon—according to John Fiske, the leading pop-evolutionary thinker—arose not only from the huge size of his brain, but also from the depth of its furrows and the plenitude of its creases. The most exalted mental events had humble somatic origins. Mind was embedded in body, and both could be passed on to the next generation.

The year 1877 marked a crucial development in this hereditarian synthesis: in that year, Richard Dugdale published the results of his investigation into the Juke family, a dull-witted crew that had produced more than its share of criminals and mental defectives. While he allowed for the influence of environment, Dugdale emphasized the importance of inherited traits in the Juke family. If mental and emotional traits could be inherited along with physical ones, then why couldn’t superior people be bred like superior dogs or horses? The dream of creating a science of eugenics, dedicated to improving and eventually even perfecting human beings, fired the reform imagination for decades. Eugenics was a kind of secular millennialism, a vision of a society where biological engineering complemented social engineering to create a managerial utopia. The intellectual respectability of eugenics, which lasted until the 1930s, when it became associated with Nazism, underscores the centrality of racialist thinking among Americans who considered themselves enlightened and progressive. Here as elsewhere, racism and modernity were twinned.

Consciousness of race increasingly pervaded American culture in the Gilded Age. Even a worldview as supple as Henry James’s revealed its moorings in conventional racial categories when, in The American (1877), James presented his protagonist, Christopher Newman, as a quintessential Anglo-Saxon but with echoes of the noble Red Man, with the same classical posture and physiognomy. There was an emerging kinship between these two groups of claimants to the title “first Americans.” The iconic American, from this view, was a blend of Anglo-Saxon refinement and native vigor. While James only hints at this, in less than a generation such younger novelists as Frank Norris and Jack London would openly celebrate the rude vitality of the contemporary Anglo-Saxon, proud descendant of the “white savages” who subdued a continent. It should come as no surprise that their heroes were always emphatically male. The rhetoric of race merged with a broader agenda of masculine revitalization.[…]

By the 1880s, muscular Christians were sweeping across the land, seeking to meld spiritual and physical renewal, establishing institutions like the Young Men’s Christian Association. The YMCA provided prayer meetings and Bible study to earnest young men with spiritual seekers’ yearnings, gyms and swimming pools to pasty young men with office workers’ midriffs. Sometimes they were the same young men. More than any other organization, the YMCA aimed to promote the symmetry of character embodied in the phrase “body, mind, spirit”—which a Y executive named Luther Gulick plucked from Deuteronomy and made the motto of the organization. The key to the Y’s appeal, a Harper’s contributor wrote in 1882, was the “overmastering conviction” of its members: “The world always respects manliness, even when it is not convinced [by theological argument]; and if the organizations did not sponsor that quality in young men, they would be entitled to no respect.” In the YMCA, manliness was officially joined to a larger agenda.

For many American Protestants, the pursuit of physical fitness merged with an encompassing vision of moral and cultural revitalization—one based on the reassertion of Protestant self-control against the threats posed to it by immigrant masses and mass-marketed temptation. […]

Science and religion seemed to point in the same direction: Progress and Providence were one.

Yet the synthesis remained precarious. Physical prowess, the basis of national supremacy, could not be taken for granted. Strong acknowledged in passing that Anglo-Saxons could be “devitalized by alcohol and tobacco.” Racial superiority could be undone by degenerate habits. Even the most triumphalist tracts contained an undercurrent of anxiety, rooted in the fear of flab. The new stress on the physical basis of identity began subtly to undermine the Protestant synthesis, to reinforce the suspicion that religion was a refuge for effeminate weaklings. The question inevitably arose, in some men’s minds: What if the YMCA and muscular Christianity were not enough to revitalize tired businessmen and college boys?

Under pressure from proliferating ideas of racial “fitness,” models of manhood became more secular. Despite the efforts of muscular Christians to reunite body and soul, the ideal man emerging among all classes by the 1890s was tougher and less introspective than his mid-Victorian predecessors. He was also less religious. Among advocates of revitalization, words like “Energy” and “Force” began to dominate discussion—often capitalized, often uncoupled from any larger frameworks of moral or spiritual meaning, and often combined with racist assumptions. […]

The emerging worship of force raised disturbing issues. Conventional morality took a backseat to the celebration of savage strength. After 1900, in the work of a pop-Nietzschean like Jack London, even criminality became a sign of racial vitality: as one of his characters says, “We whites have been land-robbers and sea-robbers from remotest time. It is in our blood, I guess, and we can’t get away from it.” This reversal of norms did not directly challenge racial hierarchies, but the assumptions behind it led toward disturbing questions. If physical prowess was the mark of racial superiority, what was one to make of the magnificent specimens of manhood produced by allegedly inferior races? Could it be that desk-bound Anglo-Saxons required an infusion of barbarian blood (or at least the “barbarian virtues” recommended by Theodore Roosevelt)? Behind these questions lay a primitivist model of regeneration, to be accomplished by incorporating the vitality of the vanquished, dark-skinned other. The question was how to do that and maintain racial purity.

pp. 135-138

Yet to emphasize the gap between country and the city was not simply an evasive exercise: dreams of bucolic stillness or urban energy stemmed from motives more complex than mere escapist sentiment. City and country were mother lodes of metaphor, sources for making sense of the urban-industrial revolution that was transforming the American countryside and creating a deep sense of discontinuity in many Americans’ lives during the decades after the Civil War. If the city epitomized the attraction of the future, the country embodied the pull of the past. For all those who had moved to town in search of excitement or opportunity, rural life was ineluctably associated with childhood and memory. The contrast between country and city was about personal experience as well as political economy. […]

REVERENCE FOR THE man of the soil was rooted in the republican tradition. In his Notes on the State of Virginia (1785), Jefferson articulated the antithesis that became central to agrarian politics (and to the producerist worldview in general)—the contrast between rural producers and urban parasites. “Those who labour in the earth are the chosen people of God, if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue,” he announced. “Corruption of morals in the mass of cultivators is a phenomenon of which no age nor nation has furnished an example. It is the mark set on those, who not looking up to heaven, to their own soil and industry, as does the husbandman, for their subsistence, depend for it on the casualties and caprice of customers. Dependence begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the design of ambition.” Small wonder, from this view, that urban centers of commerce seemed to menace the public good. “The mobs of great cities,” Jefferson concluded, “add just so much to the support of pure government as sores do to the strength of the human body.” Jefferson’s invidious distinctions echoed through the nineteenth century, fueling the moral passion of agrarian rebels. Watson, among many, considered himself a Jeffersonian.

There were fundamental contradictions embedded in Jefferson’s conceptions of an independent yeomanry. Outside certain remote areas in New England, most American farmers were not self-sufficient in the nineteenth century—nor did they want to be. Many were eager participants in the agricultural market economy, animated by a restless, entrepreneurial spirit. Indeed, Jefferson’s own expansionist policies, especially the Louisiana Purchase, encouraged centrifugal movement as much as permanent settlement. “What developed in America,” the historian Richard Hofstadter wrote, “was an agricultural society whose real attachment was not to the land but to land values.” The figure of the independent yeoman, furnishing enough food for himself and his family, participating in the public life of a secure community—this icon embodied longings for stability amid a maelstrom of migration.

Often the longings were tinged with a melancholy sense of loss. […] For those with Jeffersonian sympathies, abandoned farms were disturbing evidence of cultural decline. As a North American Review contributor wrote in 1888: “Once let the human race be cut off from personal contact with the soil, once let the conventionalities and artificial restrictions of so-called civilization interfere with the healthful simplicity of nature, and decay is certain.” Romantic nature-worship had flourished fitfully among intellectuals since Emerson had become a transparent eye-ball on the Concord common and Whitman had loafed among leaves of grass. By the post–Civil War decades, romantic sentiment combined with republican tradition to foster forebodings. Migration from country to city, from this view, was a symptom of disease in the body politic. Yet the migration continued. Indeed, nostalgia for rural roots was itself a product of rootlessness. A restless spirit, born of necessity and desire, spun Americans off in many directions—but mainly westward. The vision of a stable yeomanry was undercut by the prevalence of the westering pioneer.

pp. 246-247

Whether energy came from within or without, it was as limitless as electricity apparently was. The obstacles to access were not material—class barriers or economic deprivation were never mentioned by devotees of abundance psychology—they were mental and emotional. The most debilitating emotion was fear, which cropped up constantly as the core problem in diagnoses of neurasthenia. The preoccupation with freeing oneself from internal constraints undermined the older, static ideal of economic self-control at its psychological base. As one observer noted in 1902: “The root cause of thrift, which we all admire and preach because it is so convenient to the community, is fear, fear of future want; and that fear, we are convinced, when indulged overmuch by pessimist minds is the most frequent cause of miserliness….” Freedom from fear meant freedom to consume.

And consumption began at the dinner table. Woods Hutchinson claimed in 1913 that the new enthusiasm for calories was entirely appropriate to a mobile, democratic society. The old “stagnation” theory of diet merely sought to maintain the level of health and vigor; it was a diet for slaves or serfs, for people who were not supposed to rise above their station. “The new diet theory is based on the idea of progress, of continuous improvement, of never resting satisfied with things as they are,” Hutchinson wrote. “No diet is too liberal or expensive that will…yield good returns on the investment.” Economic metaphors for health began to focus on growth and process rather than stability, on consumption and investment rather than savings.

As abundance psychology spread, a new atmosphere of dynamism enveloped old prescriptions for success. After the turn of the century, money was less often seen as an inert commodity, to be gradually accumulated and tended to steady growth; and more often seen as a fluid and dynamic force. To Americans enraptured by the strenuous life, energy became an end in itself—and money was a kind of energy. Success mythology reflected this subtle change. In the magazine hagiographies of business titans—as well as in the fiction of writers like Dreiser and Norris—the key to success frequently became a mastery of Force (as those novelists always capitalized it), of raw power. Norris’s The Pit (1903) was a paean to the furious economic energies concentrated in Chicago. “It was Empire, the restless subjugation of all this central world of the lakes and prairies. Here, mid-most in the land, beat the Heart of the nation, whence inevitably must come its immeasurable power, its infinite, inexhaustible vitality. Here of all her cities, throbbed the true life—the true power and spirit of America: gigantic, crude, with the crudity of youth, disdaining rivalry; sane and healthy and vigorous; brutal in its ambition, arrogant in the new-found knowledge of its giant strength, prodigal of its wealth, infinite in its desires.” This was the vitalist vision at its most breathless and jejune, the literary equivalent of Theodore Roosevelt’s adolescent antics.

The new emphasis on capital as Force translated the psychology of abundance into economic terms. The economist who did the most to popularize this translation was Simon Nelson Patten, whose The New Basis of Civilization (1907) argued that the United States had passed from an “era of scarcity” to an “era of abundance” characterized by the unprecedented availability of mass-produced goods. His argument was based on the confident assumption that human beings had learned to control the weather. “The Secretary of Agriculture recently declared that serious crop failures will occur no more,” Patten wrote. “Stable, progressive farming controls the terror, disorder, and devastation of earlier times. A new agriculture means a new civilization.” Visions of perpetual growth were in the air, promising both stability and dynamism.

The economist Edward Atkinson pointed the way to a new synthesis with a hymn to “mental energy” in the Popular Science Monthly. Like other forms of energy, it was limitless. “If…there is no conceivable limit to the power of mind over matter or to the number of conversions of force that can be developed,” he wrote, “it follows that pauperism is due to want of mental energy, not of material resources.” Redistribution of wealth was not on the agenda; positive thinking was.

pp. 282-283

TR’s policies were primarily designed to protect American corporations’ access to raw materials, investment opportunities, and sometimes markets. The timing was appropriate. In the wake of the merger wave of 1897–1903, Wall Street generated new pools of capital, while Washington provided new places to invest it. Speculative excitement seized many among the middle and upper classes who began buying stocks for the first time. Prosperity spread even among the working classes, leading Simon Nelson Patten to detect a seismic shift from an era of scarcity to an era of abundance. For him, a well-paid working population committed to ever-expanding consumption would create what he called The New Basis of Civilization (1907).

Patten understood that the mountains of newly available goods were in part the spoils of empire, but he dissolved imperial power relations in a rhetoric of technological determinism. The new abundance, he argued, depended not only on the conquest of weather but also on the annihilation of time and space—a fast, efficient distribution system that provided Americans with the most varied diet in the world, transforming what had once been luxuries into staples of even the working man’s diet. “Rapid distribution of food carries civilization with it, and the prosperity that gives us a Panama canal with which to reach untouched tropic riches is a distinctive laborer’s resource, ranking with refrigerated express and quick freight carriage.” The specific moves that led to the seizure of the Canal Zone evaporated in the abstract “prosperity that gives us a Panama Canal,” which in turn became as much a boon to the workingman as innovative transportation. Empire was everywhere, in Patten’s formulation, and yet nowhere in sight.

What Patten implied (rather than stated overtly) was that imperialism underwrote expanding mass consumption, raising standards of living for ordinary folk. “Tropic riches” became cheap foods for the masses. The once-exotic banana was now sold from pushcarts for 6 cents a dozen, “a permanent addition to the laborer’s fund of goods.” The same was true of “sugar, which years ago was too expensive to be lavishly consumed by the well-to-do,” but “now freely gives its heat to the workingman,” as Patten wrote. “The demand that will follow the developing taste for it can be met by the vast quantities latent in Porto Rico and Cuba, and beyond them by the teeming lands of South America, and beyond them by the virgin tropics of another hemisphere.” From this view, the relation between empire and consumption was reciprocal: if imperial policies helped stimulate consumer demand, consumer demand in turn promoted imperial expansion. A society committed to ever-higher levels of mass-produced abundance required empire to be a way of life.