A Century of Obesity Epidemic

Why We Get Fat
by Gary Taubes

In 1934, a young German pediatrician named Hilde Bruch moved to America, settled in New York City, and was “startled,” as she later wrote, by the number of fat children she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.” Indeed, fat children in New York were so conspicuous that other European immigrants would ask Bruch about it, assuming that she would have an answer. What is the matter with American children? they would ask. Why are they so bloated and blown up? Many would say they’d never seen so many children in such a state.

Today we hear such questions all the time, or we ask them ourselves, with the continual reminders that we are in the midst of an epidemic of obesity (as is the entire developed world). Similar questions are asked about fat adults. Why are they so bloated and blown up? Or you might ask yourself: Why am I?

But this was New York City in the mid-1930s. This was two decades before the first Kentucky Fried Chicken and McDonald’s franchises, when fast food as we know it today was born. This was half a century before supersizing and high-fructose corn syrup. More to the point, 1934 was the depths of the Great Depression, an era of soup kitchens, bread lines, and unprecedented unemployment. One in every four workers in the United States was unemployed. Six out of every ten Americans were living in poverty. In New York City, where Bruch and her fellow immigrants were astonished by the adiposity of the local children, one in four children were said to be malnourished. How could this be?

Fat in the Fifties
by Nicolas Rasmussen

Obesity burst into the public consciousness in the years immediately following the Second World War. Around 1950, the US Public Health Service (PHS) issued a brochure on “the greatest problem in preventive medicine in the USA”: obesity. The life insurance industry, working in collaboration with the PHS and the American Medical Association (AMA), launched a national drive, proclaiming “Overweight: America’s No. 1 Health Problem.” And no wonder, given that insurance company data and some local health surveys suggested that more than a quarter of the American population was significantly overweight or obese. By the typical measure of the day, anyone 10 percent above the “ideal weight” for a given height fell into the category of overweight—the ideal weight being that which the insurance industry found to predict maximum longevity. Those 20 percent overweight were classified as obese. The danger of excess weight was grave, because it was the leading predictor of heart disease, the nation’s top killer. […]
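The era’s classification rule is simple enough to sketch in a few lines. The thresholds (10 percent and 20 percent above “ideal weight”) come from the passage above; the function name and labels are hypothetical, purely for illustration:

```python
def classify_weight(weight, ideal_weight):
    """Mid-century insurance-industry rule, as described above:
    10% above the 'ideal weight' for one's height = overweight,
    20% above = obese. Labels here are illustrative, not official."""
    excess = (weight - ideal_weight) / ideal_weight
    if excess >= 0.20:
        return "obese"
    elif excess >= 0.10:
        return "overweight"
    return "not overweight"

# E.g., a person at 180 lbs whose ideal weight is 150 lbs is 20% over:
print(classify_weight(180, 150))  # obese
```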

Stroke, cancer, and, most of all, heart disease leaped to the forefront as causes of death.20 By 1920 heart disease had taken the lead as the top cause of death; by the end of the decade, based mainly on evidence developed by Dublin and other insurance industry statisticians, health policy analysts came to believe that heart disease was also catching up with tuberculosis in terms of its total financial burden on the nation (despite the fact that heart disease tended to kill its victims later in their wage-earning years). Imposing double the economic burden of cancer, which would soon become the second greatest cause of death, heart disease had unquestionably become Public Health Enemy Number 1 by 1930. […] The [early 20th century] findings indicated a clear association between overweight and excess mortality. […] In 1930, Louis Dublin used this type of information as the basis for a groundbreaking actuarial study that specifically correlated overweight with heart disease.

Carbohydrates, Essential Nutrients, and Official Dietary Guidelines

“You’ll be reassured to know that you don’t have to eat carbohydrates to live. It’s not an essential nutrient.
“It’s one of the first things we learn in nutrition is what does the body not make and what you HAVE to eat.
“You won’t find carbohydrate on this list.”
~Eric Westman, There’s no such thing as an essential carbohydrate

“Carbohydrates are not essential nutrients.”
~Denise R. Ferrier, Biochemistry

“Carbohydrates are not essential nutrients.”
~Simon W. Walker, Peter Rae, Peter Ashby, & Geoffrey Beckett, Clinical Biochemistry

“Carbohydrates are not considered essential.”
~Carie Ann Braun & Cindy Miller Anderson, Pathophysiology: Functional Alterations in Human Health

“No specific carbohydrates have been identified as dietary requirements.”
~Michael Lieberman, Allan D. Marks, & Alisa Peet , Marks’ Basic Medical Biochemistry: A Clinical Approach

“In the absence of dietary carbohydrate, the body is able to synthesize glucose from lactic acid, certain amino acids and glycerol via gluconeogenesis.”
~Jim Mann & A. Stewart Truswell, Essentials of Human Nutrition

“Even when a person is completely fasting (religious reasons, medically supervised, etc.) the 130 g / day of glucose needed by the brain is made from endogenous protein and fat.
“When people are “fasting” the 12 hour period from the end of supper the night before until breakfast (“break the fast”) the next day, their brain is supplied with essential glucose! Otherwise, sleeping could be dangerous.”
~Joy Kiddie, How Much Carbohydrate is Essential in the Diet?

Dietary Reference Intakes for Energy, Carbohydrate, Fiber, Fat, Fatty Acids, Cholesterol, Protein, and Amino Acids
from the National Academies of Sciences, Engineering, and Medicine
published by the Institute of Medicine
2005 textbook of the US Food and Nutrition Board

The lower limit of dietary carbohydrate compatible with life apparently is zero, provided that adequate amounts of protein and fat are consumed. However, the amount of dietary carbohydrate that provides for optimal health in humans is unknown. There are traditional populations that ingested a high fat, high protein diet containing only a minimal amount of carbohydrate for extended periods of time (Masai), and in some cases for a lifetime after infancy (Alaska and Greenland Natives, Inuits, and Pampas indigenous people) (Du Bois, 1928; Heinbecker, 1928). There was no apparent effect on health or longevity. Caucasians eating an essentially carbohydrate-free diet, resembling that of Greenland natives for a year tolerated the diet quite well. However, a detailed modern comparison with populations ingesting the majority of food energy as carbohydrate has never been done.

Why Won’t We Tell Diabetics the Truth?
by Diana Rodgers

They base the carbohydrate requirement of 87–112 grams per day on the amount of glucose needed to avoid ketosis. They arrived at the number 100g/day to be “the amount sufficient to fuel the central nervous system without having to rely on a partial replacement of glucose by ketoacid,” and then they later say that “it should be recognized that the brain can still receive enough glucose from the metabolism of the glycerol component of fat and from the gluconeogenic amino acids in protein when a very low carbohydrate diet is consumed.” (Meaning, ketosis is NO BIG DEAL. In fact, it’s actually a good thing and is not the same as diabetic ketoacidosis that type 1 diabetics and insulin-dependent type 2 diabetics can get.) The RDA of 130g/day was computed by using a CV of 15% based on the variation in brain glucose utilization and doubling it; therefore the RDA (recommended daily allowance) for carbohydrate is 130% of the EAR (estimated average requirement).
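The arithmetic described above can be checked in a few lines. The numbers (an EAR of 100 g/day, a CV of 15%) come from the passage; this is a sketch of the stated calculation, not the Board’s actual method:

```python
# Sketch of the carbohydrate RDA derivation described above.
EAR = 100   # estimated average requirement: glucose for the brain, g/day
CV = 0.15   # coefficient of variation in brain glucose utilization

# RDA = EAR plus twice the CV, i.e. 130% of the EAR
RDA = EAR * (1 + 2 * CV)
print(round(RDA))  # 130
```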

Added sugars drive nutrient and energy deficit in obesity: a new paradigm
by James J DiNicolantonio and Amy Berger

Mankind has survived without isolated, refined sugar for almost 2.6 million years.48 The body—in particular, the brain—has been thought to require upwards of 200 g of glucose per day, leading to the often cited dogma that glucose is ‘essential for life’.1 While it is true that glucose is essential for sustaining life, there is no requirement for dietary glucose, as fatty acids can be turned into brain-fuelling ketone bodies, and amino acids and glycerol are gluconeogenic substrates.49 Indeed, in the relative absence of dietary glucose, ketone bodies may supply upwards of 75% of the brain’s required energy, with the remainder supplied by gluconeogenesis provided by amino acids (from dietary protein or catabolism of body proteins) and from glycerol (provided by the breakdown of triglycerides in adipose tissue).33 Thus, exogenous glucose (eg, from added sugars) is not essential for sustaining life in humans, and in most people, restricting dietary carbohydrates seems to produce no ill effects.49 In fact, according to the Food and Nutrition Board of the Institute of Medicine of the US National Academies of Sciences, ‘The lower limit of dietary carbohydrate compatible with life apparently is zero, provided that adequate amounts of protein and fat are consumed’.50

Administration of fructose or sucrose in humans has been shown to cause each of the abnormalities that define the metabolic syndrome (eg, elevated triglycerides, low high-density lipoprotein, insulin resistance, glucose intolerance, elevated blood glucose, elevated blood pressure and weight gain (specifically around the abdomen)),30 51–55 as well as features found in patients with coronary heart disease (eg, increased platelet adhesiveness and hyperinsulinaemia),56 57 all of which can be reversed entirely upon reverting to a diet low in sugar.47 52 56 58–60 Consumption of added sugars at current levels of intake is proposed as a contributing factor in a multitude of other diseases associated with early mortality, such as cardiometabolic disease,61–64 obesity,30 61 65–68 β-cell dysfunction and type 2 diabetes,6 20 69–71 hypertension,51 64 72 non-alcoholic fatty liver7 and atherosclerosis.6 73 74 Because of this, added sugars cannot be considered food.

What to Eat: The Ten Things You Really Need to Know to Eat Well and Be Healthy
by Luise Light, pp. 18-21, 

The alterations that were made to the new guide would be disastrous, I told my boss, the agency director. These changes would undermine the nutritional quality of eating patterns and increase risks for obesity and diabetes, among other diseases. No one needs that much bread and cereal in a day unless they are longshoremen or football players, and it would be unhealthy for the rest of us, especially people who are sedentary or genetically prone to obesity and diabetes. […]

At stake here, I told him, was nothing short of the credibility and integrity of the USDA as a source of reliable nutrition information. Over my objections, the alterations were included and the guide was finalized. I was told this was done in order to keep the lid on the costs of the food stamp program. Fruits and vegetables were expensive, much more expensive than breads and cereals, and the added servings of grains would, to some extent, offset the loss of nutrients from fruits and vegetables, the head of our division told me. However, the logic of that rationale escaped me.

Refined wheat products are what we called in the nutrition trade “cheap carbos,” stomach-filling food preferred when other, higher quality foods are unavailable or not affordable. They do little—if anything—to boost the nutritional quality of people’s diets and tend to add not only starch, but also fat and sugar to the diet. It was curious that there had been no discussion of the cost constraints of the food stamp program in any previous discussion over the many months we had been working on the guide. Intuitively, I knew I was being “played,” but other than stalling and requesting additional outside reviews I felt stymied.

Later, I remembered a Pan American Health Organization (PAHO) nutrition survey I had participated in during graduate school. One of our findings was a high rate of obesity among women in a particular region of the Caribbean country we were working in that had the lowest employment and per capita income. It puzzled me that the poorest region would have the most obese people until one of the physicians on our team explained that the prevalence of obesity was consistent with what he called an “impoverished diet,” too little nutritious food that caused people to feel hungry all the time, and with only cheap carbohydrates available to them, their hunger was never appeased, so they ate and ate and became fatter and fatter.

Was this inflated grain recommendation, I wondered, setting us up for a third world obesity scenario in our own country? Historically, the food guide was used to calculate the cost basis of the food stamps program. Did that mean we needed to develop two different sets of standards for nutrition, one for poor people and another for those better off, or did it mean that what was affordable in the food stamps program would determine what was best for the rest of us? Neither of these Hobson’s choices could be justified on scientific or ethical grounds. The changes that were made to the guide meant that any food product containing wheat flour, from white bread, Twinkies, Oreos, and bagels to pop toasters and Reese’s Puffs, would be considered nutritionally equivalent, which was not the case.

With my protests falling on deaf ears, the serving suggestions in the revised guide were incorporated into the regulations for the food stamps program, as well as the school breakfast and lunch, day care, and all other feeding programs administered by the USDA. Later, Congress set the serving amounts into legislative “stone” so it would be against the law not to serve the expanded number of grain servings that were in the new guide, a change that meant a financial windfall for the wheat industry. The new rules for school lunch programs increased the amount of bread and cereal products purchased for the program by 80 percent. For children in grades K through six, it meant eight daily servings of breads, cereals, and pasta, and for grades seven through twelve, ten servings.

For wheat growers, this meant an increase of 15 million bushels of wheat sold annually worth about $50 million and a retail sales boost of $350 million from additional sales of cereals, breads, and snacks. That didn’t include the extra sales resulting from the government subsidized food stamps program or revenues from the industry’s own efforts to shift public consumption toward more bread, pasta, and baked goods because of the new recommendations. Throughout the nineties, Americans increased their consumption of refined grain products from record lows in the 1970s to the six to eleven servings suggested in the new guide.

* * *

Partial credit for some of the quoted material goes to Bill Murrin, from comments he left at the article Dietary guidelines shouldn’t be this controversial; published at Marion Nestle’s website, Food Politics.

Taurine is a common, if often unrecognized, deficiency.

Taurine is not technically an essential nutrient, but many argue it should be labeled as such (see Harry Serpanos). It’s not unusual for people, particularly as they age, to not produce enough of it endogenously. As an osmolyte, taurine is one of the master regulators of the body. The health problems caused by deficiency of it are numerous because the purposes it serves are numerous.

One of the main areas taurine is involved in is digestion. It ensures proper pH levels for protein digestion, proper bile availability for fat digestion, and so on. Another main area is the homeostatic maintenance of mineral levels, from iron to the electrolytes (sodium, potassium, and magnesium), a role tied to its regulation of fluids.

The last function helps explain part of what has gone wrong on the Standard American Diet (SAD). When carbohydrate intake is high, insulin is constantly being spiked. This causes fluid retention and hence retention of excess electrolytes. This is why it’s generally recommended to lower sodium intake, since excess sodium raises blood pressure.

However, this is only a problem on a high-carb diet. Go to the opposite extreme of a keto diet and you get the opposite extreme of a problem. Without a constant insulin response, the body excretes unnecessary water from the cells. That would be fine by itself, but it ends up also excreting the electrolytes in the process.

Keto dieters don’t have to worry about high blood pressure, even if they heavily salt their food. The body will simply keep on eliminating it. The issue is something else entirely. Low electrolyte levels can wreak havoc in the body: cramps, tiredness, hormonal imbalances, impaired blood clotting, etc.

Of course, this is simple to solve. Many people in regular ketosis just supplement electrolytes and then feel perfectly fine. But why do they need to supplement? Hunter-gatherers don’t. The thing is that the official keto diet, as originally used for medical purposes, restricts protein out of concern over gluconeogenesis (i.e., conversion to glucose, the reason one doesn’t need to eat carbs).

It is true that a large bolus of protein — say, a large meal of meat, fish, eggs, soy, seitan, etc. — will boost insulin and knock one out of ketosis. It only does this briefly, as opposed to what happens on a high-carb diet, but those seeking ketosis for health reasons want to maintain it constantly. There are medical conditions, such as epileptic seizures, where this is necessary.

For most people, though, constant ketosis isn’t necessary. Restricting protein inevitably means restricting taurine in the diet. That can make it harder for the body even to make use of the protein that is consumed, which can leave one short on anabolic growth, repair, and healing, such as not being able to build muscle.

Such a problem isn’t limited to keto dieters, of course. The average American gets only around 12% of their calories from protein, as opposed to something like 40% of calories from seed oils, the latter being the bane of the alternative diet world. We’ve been told by health experts to reduce meat intake and most Americans have complied. So, down go taurine levels in the general public.

There are still other complications for why taurine can be hard to get, despite theoretically being so plentiful in certain animal foods. First off, the highest source of taurine is seafood, not something most Americans eat all the time. Even American beef consumption dropped quite a bit over the past century, though recently there has been a slight uptick.

Though ruminant meat is the second-best source of taurine, two factors can reduce the content in the meat that ends up on plates and between buns. Taurine is found in the liquid. Beef is often hung in a storage locker for months, sometimes a year and a half. This is the tender aged beef that we prefer, as we evolved to be scavengers.

As such, most of the meat we buy has already lost its supply of taurine before we even get it home. Then we are likely to overcook it, so even more of the taurine-filled juices drip away. Few people catch the juices and consume them. That is easy to do with a slow cooker, and you will notice the tremendous amount of liquid that sometimes comes out.

If one is to grill a steak, sear it at high temperatures on both sides. That will seal in the juices. Hamburgers are more problematic. The beef could’ve been ground much earlier, and there is nothing to hold in the taurine. One solution, if you have a butcher nearby, is to have them freshly grind the beef when you need it.

This knowledge is typically moot on a traditional diet, in particular among hunter-gatherers, since taurine is found in animal foods. They possibly get plenty of fish or at least plenty of fresh meat, often from ruminants. Dairy and eggs also have a fair amount of taurine, though not as much.

A related topic is the sodium issue for different populations. On a taurine-rich and/or low-carb diet, over-salting one’s food is a non-issue. Nonetheless, it’s interesting that, when looking at hunter-gatherers like the Hadza, it appears they don’t use salt (Ancestry Foundation, L. Amber O’Hearn – Blood, sweat, and tears: how much salt do we really need? (AHS22)). The thing is that there actually are plenty of minerals, including sodium, in animal foods.

If taurine is sufficient, as would be the case for the Hadza and others, the body will hold onto what minerals it gets. With homeostatic regulation, there will be no problem of excess sodium nor deficient electrolytes. Just eat fish and fresh meat. Then you probably will be fine in this area of health.

There are additional explanations for why this is the case. There are two things the body needs extra salt for. One is to balance out potassium. And the other is to eliminate toxins. Both potassium and toxins are more often found in plant foods. Hunter-gatherers solve this problem by prioritizing animal foods, when possible.

Hunter-gatherers can seem amazing in how they have managed to solve health problems like this with no scientific knowledge. That is because they didn’t actually solve the problem. They simply prevented it in the first place by eating as hominids have done for millions of years.

Yet to the modern perspective, it sometimes can seem amazing. It’s not only that hunter-gatherers seemingly don’t bother much with salt. The nutritionist Mary Ruddick, in talks with Harry Serpanos, discussed her time spent with the Hadza. She observed they drank very little water.

On persistence hunts lasting hours in the heat of the midday, they’d carry no water and would not stop for water. They wouldn’t even take a break to get some honey from the hives they kept passing. All they wanted was the meat. Serpanos noted, in another video, that Inuit will drink the taurine-filled fluids from a fresh kill.

Those fluids, of course, contain water. And the taurine would help maintain low levels of thirst by keeping everything in balance. But also the blood would be low in deuterium, as the animals had already eliminated it. The more deuterium one takes in, the more water one needs to eliminate it (see Harry Serpanos). Low-deuterium fluids thus mean less thirst and less need for water.

Serpanos suggested that this is why the Hadza will expend such effort in digging up, cooking, and chewing on tough, fibrous wild tubers that lack much in the way of nutrition, not even carbs. What they might be seeking is the deuterium-depleted water that is made available. This might be the same reason they’ll suck on certain kinds of leaves.

For all these reasons, hunter-gatherers could accomplish physical feats that seem impossible to an outside observer. Consider the Apache, on foot, who could outpace the United States cavalry while carrying no water or food, sometimes while crossing deserts and dry grasslands. Part of this is from being in ketosis that burns body fat for energy. Ketones are a superfuel.

The other thing is that the body also produces metabolic water from burning fat. And guess what? Metabolic water is deuterium-depleted. So, on a diet that is very low-carb or includes plenty of fasting, humans become fat-adapted, allowing easy access to energy and water as needed, as long as the body has a fat reserve.

Also, as long as the diet is animal-based, the necessary minerals such as electrolytes will be maintained. Unlike a modern athlete guzzling carbs non-stop, the hunter-gatherer can easily go on for hours with no intake of food or water, much less carbs. It’s simply not necessary. Humans evolved for persistence hunting and for going long periods between meals (Human Adaptability and Health).

The moral of the story: Eat a species-appropriate diet. Or else make sure to carefully supplement and hope for the best.

Water Quality (in Iowa and Elsewhere)

What is the quality of Iowa’s drinking water? One might suspect that it’s quite low. It’s a farm state with one of the least regulated and most big-ag-friendly economies. Iowa has a larger population of factory-farmed pigs and egg-laying hens than any other state, combined with the highest percentage of farmed land — that is to say, a lot of runoff from CAFOs (concentrated animal feeding operations) and agriculture. Yet, besides the expected contaminants of nitrates and bacteria, Iowa’s water manages to maintain a national ranking in the middle of the pack, not great but not horrible. I guess that is a minor achievement. As Midwestern residents of Middle America, we are used to being average. Admittedly, in terms of personal observation, the taste and smell of what comes out of the faucet has vastly improved since the present Iowa City water treatment plant opened in 2003. Before that, the government used to warn the public not to drink the tap water if they were young, old, sick, or pregnant; that advice applied specifically to springtime, when farmers were dumping chemicals on their fields with most of it washing into the waterways.

As an interesting side note, we have previously heard about a water spring in or near the local Hickory Hill Park that was a Native American campsite along the trail they used (a mysterious place we’ve never been able to locate). But apparently there also was the “Iowa City Mineral Spring works located on Iowa ave,” along which runs Ralston Creek, which passes through the park (Daily Iowa State Press Newspaper; Apr 29, 1901, Page 4). Actually, there were three springs at the Iowa Avenue property, and they were written about as far back as 1841 (History of Johnson County, Iowa). The site was originally owned by Robert Lucas, the Governor of the Iowa Territory. The springs apparently didn’t have soothing qualities, as Gov. Lucas was known to have a temper and almost started a war over a boundary dispute with Missouri, the so-called Honey War. A later owner built over the springs in turning the property into a health resort that didn’t succeed, so there presumably is still a spring in the basement of whichever house that is.

Springs aside, what is the public utility water like these days? In 2020, Iowa City tap water was reported as having exceeded all state and federal health standards, in that there were zero violations of the Safe Drinking Water Act, for whatever that is worth. On the other hand, according to one website, third-party independent testing reports that Iowa City tap water exceeds health guidelines for multiple drinking water contaminants: Bromodichloromethane, Bromoform, Chloroform, Chromium (Hexavalent), Dibromochloromethane, Dichloroacetic Acid, Nitrate, Total Trihalomethanes (TTHMs), Trichloroacetic Acid, etc. Then again, that comes from the official website of a water filtration system retailer. Anyway, one way or another, Iowa City tap water is exceeding — they should give out an award for that.

As a longtime contentious issue, fluoride is added by the local water treatment plant, which increases the biological uptake of lead, although lead levels in Iowa City are generally low, unless you live in an old house in an old neighborhood. On a related note, fluoride is often contaminated with arsenic. In case you didn’t know, arsenic is really bad for your health and is found in much of Iowa’s well water, though for other reasons. One of the disturbing sources of this toxin is decaying corpses that were buried before arsenic was banned in embalming fluid a little over a century ago. In Iowa City, near a Civil War cemetery, arsenic levels tested at three times the federal limit. By the way, the 2019 IC Water Quality Report didn’t test for arsenic. The last time they did test for arsenic was apparently in 2014, and they didn’t detect any — so maybe there is no reason to worry on that account, assuming you’re not drinking well water near an old cemetery.

There are still other contaminants of concern. As of a few years ago (2019), it doesn’t seem the local government was testing for either PFAS (per- and polyfluoroalkyl substances; “forever chemicals”) or microplastics. The state government, however, was on the ball, at least on one account. The Iowa DNR did an analysis last year of various locations that included Iowa City. The local tap water didn’t contain traces of most PFAS, and the one compound detected was considered at a safe level. The picture is still unclear for microplastics, which are found in 83% of water globally. It is hard to even find any discussion of microplastics in the water supply of Iowa City or across the state, other than a single letter to the editor in a local alternative newspaper, the Little Village. Other unknown potential contaminants could include pharmaceuticals, and no local info came up about that at all, which makes one wonder, considering Iowa City is a major medical town with three large hospitals and numerous assisted living residences.

As the tapsafe website put it in an article about Iowa City, “While tap water that meets the EPA health guidelines generally won’t make you sick to your stomach, it can still contain regulated and unregulated contaminants present in trace amounts that could potentially cause health issues over the long-run. These trace contaminants may also impact immunocompromised and vulnerable individuals.” Some of what is unknown is how the diversity of contaminants might interact within the human body, whether in terms of being ingested at the same time or in terms of bioaccumulation over years and decades. Yet it’s not merely a lack of knowledge of the effect but in some cases, from microplastics to pharmaceuticals, it seems we don’t even have info on to what degree they are present — they simply aren’t even on the radar of news reporting, public debate, good governance, environmental regulations, and health guidelines. That is to say, as a society, we have failed to follow the precautionary principle.

This brings us to the possibility of getting one’s drinking water from another source. Besides water filtration systems, the one product that has stood out is Mountain Valley Spring Water, as it has a good reputation for quality and transparency, not to mention many local distributors in the United States who will deliver it to your home at no additional cost; it’s also widely available in grocery stores. It supposedly is the “Official water of the White House since 1920. . . [when] they made their first delivery of Mountain Valley Water to the White House of President Woodrow Wilson on the advice of his personal physician. Since that period Mountain Valley has been served to generations of US presidents and in the halls of Congress.” And, having been on the market for a century and a half, it’s long been touted as clean and healthy, which is supported by various testing of water quality and product safety, though it won’t heal all that ails you, as advertising copy claimed early last century, back when healing springs were all the craze.

Consumer Reports had a whole slew of bottled waters tested in 2019, including Mountain Valley. Even though Mountain Valley’s colored glass bottles themselves contain undesirable elements (as is typical of colored glass), the testing seems to indicate none of it leaches into the water contained therein. Glass doesn’t break down or leach chemicals the way plastic does, hence the microplastics problem — even many expensive high-quality waters in plastic bottles are filled with microplastics, making plastic bottles something to be concerned about and to definitely avoid. Anyway, the local vendor provides Mountain Valley in 5-gallon glass jars that appear to be clear, even if colored glass were an issue, which it’s not. The only downside to this option is pricing, but then again a high-quality water filtration system costing thousands of dollars would take years to pay off before it saved you money compared to home-delivered spring water. One’s preference partly depends on how much money one has to invest upfront. Still, even the best systems, like reverse osmosis, don’t actually remove all contaminants and so wouldn’t give you water as pure as an ancient spring source.

That brings us to another important point: how protected is the source? Besides the plastic issue, the main cause of contamination in bottled waters probably comes from the water source itself. Sadly, it’s not only groundwater, waterways, and wells that are contaminated but apparently also more than a few springs, such as the one in Mexico from which the delicious and popular Topo Chico is procured (some improvements have been made, though far from meeting health guidelines). This might be because many springs contain water that only filtered through the ground for a few months or a few years, and as such they retain some of the contaminants from wherever the water originated. Mountain Valley, on the other hand, is from a spring with water that fell as rain three and a half millennia ago, during the Bronze Age, long before industrialization. Other springs are even more ancient. One of the worrisome contaminants is the abovementioned PFAS, the specific contaminant found at high levels in Topo Chico spring water. Consumer Reports has listed brands according to their PFAS content and offers further discussion of specific brands. They didn’t mention Mountain Valley, but other testing hasn’t found these problematic chemicals.

The lesson for the day: Be careful about what you put into your body! Water is good for your health, until it is not. And quality can be hard to determine, requiring that you do your due diligence — buyer beware, as they say. This is an area where regulatory bodies have largely failed the public, or else been highly inconsistent and at times careless, possibly because of corporate lobbyist pressure (e.g., the lack of regulation of PFAS, and the fact that some of the most contaminated products are owned by big biz and highly profitable, such as Coca-Cola’s Topo Chico). Once inside you, no one entirely knows what all of these weird substances do to your delicate innards. Be kind to your innards and they will be kind to you. But fail to heed this warning and you will slowly rot from the inside out, as you writhe in agony while cursing the gods for the day you were born. It could happen. One way or another, drink the best water you can reasonably afford as a starting point of health, even if a charcoal filter is the only option in your price range.

Millennials Are Hitting Old Age In Their Thirties

There is a comedy sketch, This is Your Brain After Thirty, from the group It’s a Southern Thing. It is a parody of a pharmaceutical commercial, and its target audience is Millennials who are now feeling the evidence of growing older. The voiceover begins, “Are you in your 30s? You may not feel old. But you don’t exactly feel young, either.” Then it presents three characters with their symptoms:

  • Person 1: “Sometimes I walk into a room and completely forget what I walked in there for.”
  • Person 2: “I can’t remember my own phone number. And I’ve had the same number for ten years.”
  • Person 3: “I know I had supper last night. I clearly don’t skip meals. But for the life of me, I can’t remember what I ate.”

The voiceover continues with the official diagnosis. “Then you might be suffering from Thirties Brain.” There is nothing quite as comforting as having a label. That explains everything. That’s just what happens when one reaches old age in one’s thirties. Yeah, that’s completely normal. Don’t worry, though. “It’s not your fault,” reassures the voice of authority. More info is then offered about it:

“It’s a common condition that affects millions of people. People who are old enough to take their 401(k) seriously, but not quite old enough to enjoy eating at Golden Corral. It’s not your fault. Your brain is too full of useless knowledge, now. Why remember your own phone number, when you could retain every word of the 2001 hit “Drops of Jupiter” by Train? Thirties Brain can make even the most simple conversations feel exhausting. But as soon as it feels like you can think clearly again, your brain stops working again. If this sounds like you or someone you love, then ask your doctor about our new twice-a day…”

Of course, this is just comedy, but it’s funny for the very reason that so many can relate to the experience. In becoming part of popular culture, it’s being normalized. That is rather sad when one thinks about it. Should we really be normalizing early-onset neurocognitive decline? What they are now jokingly calling “Thirties Brain” would not long ago have been called “Fifties Brain” or “Sixties Brain”. Indeed, many serious health conditions like Alzheimer’s used to be entirely identified with old age and are now increasingly being diagnosed among the young (when we were kids, Alzheimer’s would sometimes be called Old Timer’s disease). The same is true of type II diabetes, which originally was called adult-onset diabetes because adulthood was typically the age of diagnosis. These conditions are part of metabolic syndrome, or metabolic dysfunction, which involves insulin resistance as a key component.

Also common in metabolic syndrome is obesity. It instantly stood out that the actors in the parody commercial were all quite overweight, to the point of being obese. Yet obesity also has been normalized, particularly in the South where it’s rampant. Obesity involves inflammation throughout the body, and inflammation is also seen in the brain with Alzheimer’s (along with depression, etc.); inflammation is likewise related to autoimmune disorders, from multiple sclerosis to rheumatoid arthritis. Body fat is an organ, like the liver, spleen, or thyroid. And, in particular, body fat is key to the functioning of the hormone system. Hormones like insulin don’t only regulate appetite and glucose but also a number of other interlinked systems in the body. That is why metabolic syndrome can manifest as numerous health conditions and diseases. And that is why metabolic syndrome is the main comorbidity of COVID-19 and other infectious diseases.

If you’re experiencing “Thirties Brain”, you should take it as a serious symptom to be worried about. It’s an early sign of health decline that is only going to get worse, unless you change your diet and lifestyle. People typically have metabolic syndrome for years or even decades before finally being diagnosed with a disease that doctors recognize, something like diabetes or cardiovascular disease. But it can often be easily reversed, particularly if caught early. Unfortunately, few Americans realize that this is a public health crisis, and one that is entirely preventable. Many experts have predicted that healthcare costs will continue to skyrocket, as healthcare eats up more of the national GDP and causes widespread medical debt.

This could end up as an existential crisis for our society. That is what happened with the World War II draft, when the United States military suddenly realized that many young men were severely malnourished: “40 percent of the raw recruits drafted by our military proved to be so malnourished that they could not pass a physical and were declared unfit for duty” (Stephen Yafa, Grain of Truth, p. 17; quoted in Malnourished Americans). After the war, there was a public campaign of nutritional fortification of food and meal programs in schools, along with official dietary recommendations. It was also the era when obesity was finally seen as a public health crisis (Nicolas Rasmussen, Fat in the Fifties: America’s First Obesity Crisis).

At present, the military is once again acknowledging that this is a serious problem (Obese Military?). By law, the U.S. military is required to serve food that conforms to the U.S. dietary guidelines. Yet, despite military personnel having high levels of exercise, obesity is also increasing in the military. As research has shown, even when caloric intake and exercise are controlled for, the standard American diet (SAD) is obesogenic (Americans Fatter at Same Level of Food Intake and Exercise). But, on a positive note, the military is beginning to recognize the cause of the problem. They’ve determined that it is linked to the diet soldiers are being given. And research on soldiers has shown that a ketogenic diet will help with fat loss.

The U.S. military is forced to be so honest because it’s simply not an option to have obese soldiers, much less soldiers experiencing neurocognitive decline. It’s only a question of when other institutions of authority will catch up. There are signs that changes are already in the air (Slow, Quiet, and Reluctant Changes to Official Dietary Guidelines; & American Diabetes Association Changes Its Tune). After decades of blaming saturated fat, it’s becoming clear that the real culprits are carbohydrates and industrial seed oils; although other factors are involved in the general health crisis, such as, possibly, hormone mimics that are stunting male development (Real Issues Behind Regressive Identity Politics), but that is diverging from the immediate topic at hand.

The fact is that the consumption of saturated fat has declined ever since industrial seed oils replaced animal fats as the main source of fatty acids in the American diet, back in the 1930s. Likewise, beef intake has dropped to about as low as it was in the first half of the 20th century, after briefly peaking in the 1970s (Diet and Health, from the Johns Hopkins Center for a Livable Future). Meanwhile, what has risen in the American diet, besides industrial seed oils, is mostly plant foods: vegetables, fruits, fruit juices, soda pop, grains, rice, legumes, nuts, and seeds. The only animal foods that have seen a significant increase are fish and chicken, the two supposedly healthy meats. That is the modern SAD diet that has led to the sudden appearance of “Thirties Brain”. Welcome to the new normal!

To make a related point, this health decline can’t be blamed on a factor like behavior, no matter how much lifestyle is implicated as well — you can’t outrun a bad diet, as some say. The younger generations have become quite health-conscious; it’s simply that the health advice they’ve been given is wrong. Young adults are eating more supposedly healthy foods than did people in the past, including rising rates of plant-based diets: Mediterranean, vegetarian, vegan, etc. Also, when younger, Millennials (and Generation Z) had lower rates of teen sexual activity, alcohol consumption, and drug use. As observed elsewhere, one could call them prudes (Rate of Young Sluts), or at least that used to be true. But something has definitely changed that is now affecting their behavior.

After living through a major recession and a global pandemic, we are now seeing a rise in behavioral health issues among younger Americans, with rising rates of self-medication, specifically alcohol and tobacco (Blue Cross Blue Shield Association study finds nearly one-third of millennials are affected by behavioral health conditions, Independence Blue Cross). Still, the rates of alcohol and tobacco consumption are now approximately the same as they were in the early 1900s, which was rather low compared to the later spike in the second half of the 20th century (graph from The Health Consequences of Smoking—50 Years of Progress: A Report of the Surgeon General; & Mona Chalabi, Dear Mona Followup: Where Do People Drink The Most Beer, Wine And Spirits?).

Some countries with more alcohol and tobacco usage than the US are, nonetheless, healthier (France, Germany, etc.). Limiting ourselves to the US, consider the residents of Roseto, Pennsylvania, who were studied from 1954 to 1961. At the time, they were the healthiest population in the country, despite being quite fond of drinking and smoking, not to mention their love of processed meat and saturated fat like lard (Blue Zones Dietary Myth). So, the recent slight shift in drinking and smoking among Millennials also ends up being a non-explanation. It’s more likely a result of declining health than a cause, and hence the reason to describe it as self-medication. Or, more generally, the addictive mindset isn’t limited to addictive substances; and, besides, drug use is nothing new (The Drugged Up Birth of Modernity).

Anyway, keep in mind that these Millennial rates of substance abuse are still lower than what was seen, for example, among Generation X, which had far fewer health problems at the same age, even with GenXers being the most lead-poisoned living generation. Something unique is going on right now, and it’s hard to explain it with anything other than an ultra-processed diet high in carbs and industrial seed oils. Back when the first wave of GenXers hit their thirties in the mid-1990s, no one was talking about “Thirties Brain”. And neither did it come up with prior generations. We complain about U.S. presidents of the Silent Generation (Donald Trump and Joe Biden) in their seventies who show obvious neurocognitive decline, but that is a vast difference from decline in one’s thirties.

To put that in further perspective, consider a discussion of health in terms of running. It was part of an argument that humans evolved for running. This is supported by the fact that persistence hunting (i.e., running game down) is one of the oldest and most widespread hunting techniques, as it requires almost no technology other than something to club or stab the animal to death after it collapses from heat exhaustion. The human body seems extremely well-adapted to long-distance running, especially in heat; and this also seems closely linked to the predilection for ketosis (Human Adaptability and Health). What is relevant for our discussion here is that hunter-gatherers reach their peak aerobic health in their fifties. The average middle-aged hunter-gatherer can outrun the average eighteen-year-old hunter-gatherer. Up into old age, hunter-gatherers can keep up a fast pace with others who are much younger.

Think about how many middle-aged or older Americans could do the same. Unsurprisingly, hunter-gatherers likewise have very few of the diseases of civilization. Obesity, of course, is almost unheard of among them. They have what is called a long healthspan, where most people live healthily into old age and then die suddenly, without any lingering sickness or long periods of degeneration. In such a healthy society, they likely wouldn’t even understand the concept of “Thirties Brain”.

* * *

Some might think Millennials are being unfairly criticized. That is not the intention. This health decline hardly began in recent decades. Weston A. Price and others were talking about it in the early 1900s. There was even a growing debate about it in the century before that. Heck, all the way back in the 1700s, people were recommending specific medical diets for obesity and diabetes, as it was already being observed that these conditions were becoming more common. The only difference is that we are finally hitting a point of extreme consequences, as diseases of old age are now prevalent among the young, sometimes in early childhood.

We write posts like this with genuine concern and compassion. We are not disinterested observers, much less see ourselves as standing above these problems with condescension. It’s all rather personal. Though relatively healthy in many ways, we have experienced serious neurocognitive and mental health issues since our own childhood. And we suspect we previously were suffering from metabolic syndrome, if not yet diagnosed with any particular disease. To be specific about the point made in the parody video, we have experienced our own equivalent of “Thirties Brain”, as we had a memory-related learning disability that was diagnosed in third grade. For our entire lives, we’ve struggled with memory recall.

So, personal concern underlies our public worries, magnified by the fact that our nieces and nephew span the generations of Millennials and GenZ, allowing us to observe firsthand the health issues involved. From our own experience, we know what it’s like to be addicted to carbs and to suffer the consequences. We know what it’s like to struggle with serious mental illness, specifically depression with suicidal ideation, since young adulthood. It saddens us immensely to think that large numbers of Millennials will begin having so many harsh problems this early in life. That is a plain shitty situation, and Millennials did nothing to deserve it. Like the rest of us, they were simply born into this society with its food system and dietary recommendations.

For the most part, the majority of Millennials and other Americans have basically been doing what they were told is healthy. They don’t realize that what has been normalized should not be taken as normal because very few of them have anything to compare against. It’s not like most of us have ever lived among hunter-gatherers to realize how far human health has fallen. Even the traditional rural diet and lifestyle has mostly slipped from living memory. Certainly, hunting and fishing have become uncommon. Getting ultra-processed food from a grocery store or restaurant is simply what people do now.

* * *

44% of older millennials already have a chronic health condition. Here’s what that means for their futures
by Megan Leonhardt

Why insecure millennials are set for unhealthy middle age
by Greg Hurst

Gen X, Millennials in Worse Health Than Prior Generations at Same Age
by Amy Norton

Millennials less heart-healthy than Gen Xers at the same age
by Anicka Slachta

BCBSA: Millennials’ mental health is on the decline—and COVID-19 is making it worse
by Paige Minemyer

Millennials on Track to be Most Obese Generation in History
by Cathy Cassata

Diabetes’ Impact Is Rising Fastest Among Millennials
by Laura Entis

Study: Young adults with high cholesterol face greater risk of heart attack or stroke
by Ken Alltucker

The number of millennials with early-onset Alzheimer’s disease is surging, report finds
by Tracy Romero

Millennials may need to worry about autoimmune disease, right away
by Swedish Blogger

For millennials, cancers fueled by obesity are on rise, study says
by Sandee LaMotte

Study: Millennials’ Increased Risk for Some Obesity-Linked Cancers — 5 Takeaways
by Sandy McDowell

The coming of vegetables, fruits and key nutrients to the European diet
by V. J. Knapp

“On the basis of evidence now accumulating, vegetables and fruits were not always an integral part of the European diet. Prior to 1800, vegetables and fruits were not esteemed but rather looked down upon. It has only been over the past two centuries that these two critical foods have come into vogue. First, they had to be accepted by a growing number of medical men and observers. Then, once licensed as edible foods, vegetables and fruits, starting with the potato, actually did make their way into every man’s diet. And by the end of the nineteenth century, these rich sources of carotene and Vitamins A, C and E became so universal that Europeans now forgot that a hundred years earlier these foods had barely been consumed.”

What’s on your table? How America’s diet has changed over the decades
by Drew Desilver

What happens when you take public health advice to heart?
by Lena Zegher

Why are we fatter and sicker than ever? The graphs that explain how sugar, fruit juice and margarine are to blame
by Anna Hodgekiss

What fruits and vegetables looked like before
by Andreas Eenfeldt

Banana – before and after


Carrot – before and after


Watermelon – before and after
