Ancel Keys, One of Lewis Terman’s Termites

Unless you are seriously interested in diet and nutrition, you’ve probably never heard the name of Ancel Keys (1904-2004). Yet he was one of the most influential men of the 20th century, at least within the areas of nutrition studies, government food policy, and official dietary recommendations. He developed the so-called ‘Mediterranean diet’, although it could more accurately be called a post-war scarcity and austerity diet, since we now know it has little in common with the pre-war traditional Mediterranean diet that prioritized lard, not olive oil. Because of his public campaign against animal fats and his research on heart disease, he was sometimes referred to in the press as ‘Dr. Cholesterol’, despite not being a doctor. He was academically successful and had a scientific background, but, oddly considering his career path, he had absolutely zero formal education or professional training in nutrition studies or in medicine. Instead, his extended higher education included chemistry, economics, political science, zoology, oceanography, biology, and physiology.

His career as a scientific researcher started in 1931 with a study on the physiology of fish and eels, his main area of expertise at the time, whereas his first work in diet and nutrition came later, by accident of historical circumstance. The US military sought to develop prepared rations for soldiers and, as no one else at the University of Minnesota wanted this lowly assignment, Keys, then at the bottom of the totem pole, saw it as an opportunity to advance his career. Lacking the requisite knowledge and expertise, he was, according to his colleague Dr. Elsworth Buskirk, “told to go home and leave such things to the professionals,” but he persisted in obtaining funds and came up with something that met specifications (From Harvard to Minnesota: Keys to our History). This became what is known as the K-ration. During the Second World War, he did much other work for the military, and that paved the way for his entering the field of nutrition studies. It was through the military that he did research on humans, much of it focusing on extreme conditions of stress, from high altitudes to starvation. This led to a study on vitamin supplementation during that period and, after the war, a prospective dietary study begun in 1947.

Yet Keys wouldn’t fully enter the fray of nutrition studies until the 1970s. He was about 70 years old when, in his battle with the British sugar researcher John Yudkin, he finally became a major contender in scientific debates. His controversial Seven Countries Study, although begun in 1956, wasn’t published until decades later, in 1978, almost 40 years after his first involvement in animal research. The height of his career extended into his 80s, giving him many decades to mentor students, allies, and followers to carry on his crusade. He had a towering intellect and charismatic personality that gave him the capacity to demolish opponents in debate and helped him to dominate the media and political battlefield. Think of Keys as a smarter version of Donald Trump, with a similar instinct for media manipulation of public perception, maybe related to Keys’ geographic and familial proximity to Hollywood: “As the nephew of silent screen star Lon Chaney, Keys also filmed all of his scientific work and was a first-rate publicist, frequently writing for popular audiences” (Sarah W. Tracy, Ancel Keys). He was a creature of the mass media that took hold during his lifetime.

Though now largely forgotten by the general public, Keys once was a famous figure whose picture appeared on the covers of national magazines, from Time to Life. He personally associated and politically allied himself with many powerful politicians, health experts, and leading scientists. Whether or not you know of him, his work and advocacy shaped the world most of us were born into, and he had a direct impact on the modern food system and healthcare practices that have touched us all. A half century ago, his fame was comparable to that of Dr. John Harvey Kellogg (1852-1943), a Seventh Day Adventist and eugenicist from an earlier generation who also worked in the field of diet and nutrition. Kellogg was one of the earliest vegans, invented breakfast cereal, and operated a sanitarium popular among the elite: politicians, movie stars, writers, and artists. Dr. Kellogg preached against race mixing, warned of race degeneracy, and, to promote eugenics, founded the Race Betterment Foundation, which held several national conferences. He advocated the development of a “eugenic registry” to ensure “proper breeding pairs” that would produce “racial thoroughbreds,” but for inferior couplings he advised sterilization of “defectives.” Though coming from different ideological perspectives, Keys and Kellogg were the twin forces in shaping anti-fat ideology and, in this scapegoating of animal fats, shifted the blame away from sugar, the actual culprit behind metabolic syndrome (obesity, diabetes, heart disease, fatty liver, etc.). That misdirection sent nutrition studies down a blind alley and misled public policymakers, a quagmire we are still in the middle of.

To be fair, it must be clarified that Keys never showed any proclivities toward eugenics, but I bring it up because there is a connection to be explored. As a child, he had tested as having a high IQ. After Keys’ parents “signed him up while he was a student at Berkeley High School,” according to Richard C. Paddock (The Secret IQ Diaries), he was given entrance into a study done by Lewis Terman (1877-1956), a noted psychologist who, like Dr. Kellogg, was an early 20th century racist: “He joined and served as a high ranking member in many eugenic organizations (the Human Betterment Foundation, the American Eugenics Society, and the Eugenics Research Association), and worked alongside many others (such as the American Institute of Family Relations and the California Bureau of Juvenile Research)” (Ben Maldonado, Eugenics on the Farm: Lewis Terman). In studying and working with gifted youths like Keys, Terman sought to prove the hypothesis of social Darwinism through eugenics (‘good genes’). He believed that such an ideological vision could be made manifest through a genetically superior intellectual elite who, if promoted and supported and given all the advantages a society could offer, would develop into a paternalistic ruling class of enlightened aristocracy with the potential of becoming humanity’s salvation as visionaries, creative geniuses, and brilliant leaders. It was a humble aspiration: to remake all of society from the ground up.

This attitude, bigoted and socially conservative (e.g., prejudice against “sexual deviancy” in seeking to enforce traditional gender roles), was far from uncommon in the Progressive Era. Keep in mind that, at the time, ‘progressivism’ wasn’t solely or even always primarily identified with social liberalism. Among the strongest supporters of Progressivism were Evangelicals, Mormons, Klansmen, Jim Crow leaders, white supremacists, WASP elites, military imperialists, and fascists — think of one of the most famous of Progressive leaders, President Theodore Roosevelt, who was a racist and imperialist; and even his distant cousin, the Progressive President Franklin Delano Roosevelt, was not without racist and imperialist inclinations. Progress back then had a different connotation, and many of these American eugenicists were a direct inspiration to Adolf Hitler and other Nazi leaders. After all, the enactment of progressive Manifest Destiny was still playing out in the last of the Indian Wars all the way into the 1930s, before the remaining free Indians were finally put down. The proto-neocon Civilizing Project was long and arduous and more than a bit bloody. This ideology continued even after the defeat of the Nazis, as sterilization of perceived inferiors was still practiced in the United States for decades following the end of the Second World War, all the way into the 1970s. Eugenics has been persistent, to say the least.

Inspired by this idealistic, if demented and distorted, ideology of evolutionary advancement and Whiggish progress, Terman developed the Stanford-Binet IQ test. During the First World War, he worked with the military to implement the first mass testing of intelligence. His IQ test was the initial attempt to scientifically measure what is now called general intelligence or the g factor, for which he coined the term “intelligence quotient” (IQ): a mysterious essence that many at the time believed to be inherent to the individual psyche from birth, genetically inherited from one’s parents. The Stanford-Binet was a measure of academic ability, or what today we might think of as ‘aptitude’, specifically assessing attention, memory, and verbal skill through tasks in arithmetical reasoning, sentence completion, logic, synonyms-antonyms, symbol-digit testing, vocabulary, analogies, comparisons, and general information. The focus was on crystallized intelligence, but the test was also culturally biased and coded for socioeconomic class.
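To make the numbers concrete: in the original Stanford-Binet, the ‘quotient’ was literal, a simple ratio of tested mental age to actual age (later revisions dropped this in favor of deviation scoring, norming each age group to a mean of 100 with a standard deviation of 15, as described in the Vigdor and Londergan excerpt below):

IQ = (mental age ÷ chronological age) × 100

So, for example, a ten-year-old performing at the level of an average twelve-year-old would score (12 ÷ 10) × 100 = 120.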

The Stanford-Binet was modeled after the intelligence test of the French psychologist Alfred Binet. There was a significant difference, though. Binet used his test to identify those most in need in order to help them improve, whereas Terman saw these ‘deficient’ children as a danger to society that should be eliminated, quite literally with sterilization — this had real world application and consequences: “Terman’s test was also used regularly to determine who should be sterilized in the name of eugenics: individuals with an IQ of under 70 (deemed feebleminded) were targeted for sterilization by the state, such as in the famous case of Carrie Buck. In the United States, over 600,000 people were sterilized by the state for eugenic reasons, often because of IQ test results. For many eugenicists, Terman’s research finally presented a way to efficiently and “objectively” judge the eugenic worth of human lives” (Ben Maldonado, Eugenics on the Farm: Lewis Terman). Instead of helping the poor and disadvantaged, Terman hoped to use his own adaptation of Binet’s test to identify the smart kids and ensure they became high achievers, gaining the success and respect they supposedly deserved. This was a response to his own childhood struggles as a sickly nerd growing up among other farm kids in rural Indiana.

By the way, this was the specific area that later on would become the stronghold of the Second Klan, with the Indiana Grand Dragon D.C. Stephenson having set up base in Terman’s old hometown. The Second Klan rose to power at the very moment the adult Terman, having left Indiana, began his eugenicist project of IQ testing. That was no coincidence. Following upon a period of moral panic, there was a mix of fear and hope about the future and, central to public debate, threats to the survival of the white race were a major concern (The Crisis of Identity). The purpose of eugenics was basically to show that the right kind of people were a special breed of humans who, in eliminating what held back their genetic potential, would rise up to make America great again and so return Western Civilization to its previous glorious heights. The agenda, of course, wasn’t to create a fair and objective measure of human worth and human potential, for the assumptions it was built upon presupposed the race and class of the people who, by definition, were the best of the best. Terman was simply seeking to prove what he already ‘knew’ as a true believer in social, moral, mental, and racial hygiene.

With this hope in mind, Terman began in 1921 to gather a large group of children who scored high on his IQ test, a total of 1,521 subjects, including the teenage Ancel Keys. His selection process was highly subjective and idiosyncratic. It just so happened that, among a total sample of 168,000 students, Terman included only 6 Japanese-Americans, 2 African-Americans, 1 Native American, and 1 Mexican-American. The vast majority of those chosen were white, urban, and middle class boys, largely drawn from the college towns and suburbs of Northern California. These were known as Terman’s kids or ‘Termites’. Betraying scientific objectivity, he intervened in the lives of his subjects, sometimes openly but also behind the scenes. He followed these subjects into adulthood to find out how they turned out and to ensure they gained advantages, such as writing letters of recommendation for college entrance, job applications, and professional contacts. The eugenics project was not a passive endeavor of neutral scientific observation.

Whatever is to be thought of it, there is no doubt that the study of Terman’s children was the first and maybe only time a hypothesis of social Darwinian eugenics was so fully tested at such an ambitious level. Across all scientific fields, there is no other longitudinal study that has lasted so long and, as some of the subjects remain alive, there are scientists carrying on the work to this day, with the last of the surviving Termites still dutifully filling out the surveys sent to them (Ancel Keys remained a participant until his death in 2004, two months shy of his 101st birthday). One has to give Terman credit for having dared to scientifically test his belief system in a falsifiable study, setting aside the problems with confounding factors. He put his convictions on the line, although Hitler was even more ambitious in using war as a test of sorts, forcing an end result of either total domination or total destruction, to prove or disprove the hypothesis of German racial supremacy. I guess we can be grateful that Terman took the less violent approach of scientific analysis, one that didn’t require the vast desolation of battlefields or doctors experimenting on unwilling victims in concentration camps.

Terman’s decades-long experiment, continuing as it did into the post-war period, ended in failure by his own standards of expectation. Before his death in 1956, he was able to see how few of the children grew up to amount to much, beyond many of them becoming moderately successful middle class professionals, although a few attained some prominence: “Among some of the original participants of the Terman study was famed educational psychologist Lee Cronbach, “I Love Lucy” writer Jess Oppenheimer, child psychologist Robert Sears, scientist Ancel Keys, and over 50 others who had since become faculty members at colleges and universities” (Kendra Cherry, Are People With High IQs More Successful?). In Cradles of Eminence, Victor and Muriel Goertzel analyzed the Termites according to eminence, defined as having multiple biographies written about someone who is neither royalty nor a sports star. It turns out none of Terman’s subjects had even a single biography written about them. Crystallized intelligence, at best, moderately predicted professional success and conformity within the social order. However, once later tests removed the cultural and class biases, IQ tests stopped being useful for predicting even this much. When environmental factors and family background are controlled for, almost all IQ differences disappear. A lower IQ rich person is more likely to be successful than a higher IQ poor person. Surprise, surprise!

Interestingly, in comparing the Termites to their peers, “two children who were tested but didn’t make the cut — William Shockley and Luis Alvarez — went on to win the Nobel Prize in Physics. According to Hastorf, none of the Terman kids ever won a Nobel or Pulitzer” (Mitchell Leslie, The Vexing Legacy of Lewis Terman). It’s ironic that Shockley later followed Terman’s example by also becoming a eugenicist and, through his friendship with Terman’s son Frederick, was hired as a professor at Stanford, where the senior Terman had done his scientific work, which is why his IQ test was called the Stanford-Binet. Shockley and Frederick Terman came to be known as the fathers of Silicon Valley for having developed the high tech start-up model and for having played a central role in bringing in the massive Pentagon funding that has defined and dominated the American tech industry ever since (e.g., Jeff Bezos sitting on a Pentagon board while holding numerous government contracts). Social Darwinism, intellectual elitism, and paternalistic technocracy remain the ascendant ideology of Silicon Valley tech bros and the capitalist class of entrepreneurial philanthropists who seek to shape society with their gifted genius, not to mention their wealth (e.g., Bill Gates and the Gates Foundation).

Lewis Terman privately admitted that some of his strongest bigoted views were wrong, but unlike many other eugenicists he never publicly recanted his earlier racism. Nonetheless, he was honest enough to conclude that a pillar of eugenicist dogma was flat-out wrong, stating that, “At any rate, we have seen that intellect and achievement are far from perfectly correlated.” He divided the 730 subjects he was able to follow into adulthood into three groups: only 20% were placed in Group A, those he deemed successful; an equal number, 20%, he judged to be failures, falling into Group C; and most fell into the middle Group B, which included those working in positions “as humble as those of policemen, seaman, typist and filing clerk.” That is rather unimpressive. Writing about this, one person noted that, “The ones among Group A overwhelmingly were from the upper class. The Cs were majorly from the lower class. Majority in the group had careers that were quite ordinary. […] Sociologist Pitirim Sorokin, in his critique of the study, argued that Terman’s selected group of children with high IQs did about as well as a random group of children selected from similar family backgrounds would have done.”
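In raw numbers, assuming those percentages are exact, that works out to roughly 146 subjects each in Groups A and C (20% of 730), leaving roughly 438, or 60%, in Group B.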

Beyond the unsurprising prediction that wealthier people with better chances have better outcomes, the predictive ability of his IQ test was completely off the mark. The Termites, for all their test-taking ability, showed no advantages over the general population. The IQ test did demonstrate academic ability, for whatever that is worth. Among Termites, the rate of college graduation was extremely high (70%, ten times that of their peers), but on average they were still only getting B grades in their classes, and a college degree didn’t translate into greater real-world accomplishment. They were smart, even if no more successful than their socioeconomic equivalents. If they were wealthy, they did as well as other wealthy people. And if they weren’t wealthy, then they followed the typical path of underachievement. Supposed superior genetics offered no protective advantages beyond the social, racial, and economic privileges given or denied in the lottery of birth.

Even among the successful Termites, there was nothing unusual to be praised. “Rebels were scarce among the Termites, and Henry David Thoreau’s different drummer would have found few followers,” wrote Shurkin in Terman’s Kids. “They did not change life; they accepted it as it came and conquered it.” As good test-takers and students, they were the ultimate conformists, well-lubricated cogs in the machine. They knew how to play the game to win, but the game they played, that of mainstream success and conventional respectability, had rules they followed. These weren’t the types to rock the boat. Rather, Termites were simply well-educated sheep (see: A Ruling Elite of Well-Educated Sheep; & William Deresiewicz, Excellent Sheep). “This is unsurprising,” Elizabeth Svoboda points out, “given that the kinds of people who ace aptitude tests are, by definition, those specialising at jumping through the hoops that society has set up. If you believe that your entire purpose on Earth is to finish the course, chances are you’ll remain within its boundaries at all costs” (The broad, ragged cut).

As expected, the single greatest factor is environment. It’s not so much who we are, as if we had an inborn psychological profile where character is fate, since who we are depends on where we are (Dorsa Amir, Personality is not only about who but also where you are). And we know from epigenetic research that it also matters where our parents, grandparents, great-grandparents, and those before them were, as environmental factors carry forward in our family inheritance, such as the grandchildren of famine victims having higher rates of obesity. The world is complex and humans are shaped by it. Despite Terman’s ideological failure, many aptitude tests were based upon this model. Our entire education system has since been redesigned to teach to such tests and to serve as a filtering process for educational advancement, on the assumption of a pseudo-meritocratic dogma not all that different from Terman’s eugenicist dream of a better humanity.

As with fascism, the dangers and harms of eugenics linger on within our institutions and within our minds. We are trapped within false and misleading systems of ideological realism. That isn’t particularly smart of us, as individuals and as a society. We’d be better off promoting the development and opportunities of the majority (James Haywood Rolling, Jr., Swarm Intelligence), rather than investing almost all of society’s resources in a privileged elite who we desperately hope will be our salvation. Considering the national and global failure among the ruling class and capitalist plutocrats, maybe we should create a citizenry that can solve their own problems. Basically, maybe we should take seriously democracy, really and fully try it for the first time, not as superficially inspiring rhetoric to cling to in the darkness but as a lived reality. As ambitious experiments go, democracy is definitely worth attempting.

Up to this point, the democratic experiment has been more of a hypothesis waiting to be tested. The oligarchic and filthy rich American ruling elite, for some reason, have never been in favor of testing the potential for self-governance among the American people. Eugenics has been more thoroughly researched over the past century than have liberty and freedom. That speaks volumes about American society. But it isn’t only about eugenics, as authoritarian elitism and paternalism have taken many other forms. Let’s bring it back to Ancel Keys. Even though he was one of the Elect personally groomed by Lewis Terman to be a leading member of the master race, Keys rejected “Terman’s hereditarian bias” and thought that “personal will … is a greater factor in success than inherited intelligence” (Richard C. Paddock, The Secret IQ Diaries).

Even so, it appears that Keys carried on the sense of personal superiority that Terman helped instill in him. As part of the supposed meritocracy, he didn’t feel a need to humbly pursue scientific advancement in the workmanlike fashion of careful research and cautious analysis. He had such immense confidence in knowing he was right, and that inferior minds were wrong, that he saw no need for scientific debate and, instead, used his political power and media influence to effectively shut down debate by silencing his opponents. As a self-identified genius imbued with noblesse oblige (with great power comes great responsibility), he wanted to change the world and had the zealous conviction to enforce his will upon others. It was irrelevant that he dismissed the idea that his superior worth was based on genetics, as the effect was the same no matter what justification he believed underlay his dogmatic mission of dietary evangelism (The Creed of Ancel Keys). Following in the footsteps of Lewis Terman, he aspired to be a paternalistic technocrat who would save the lesser folk from their wrong thinking and behavior. He simply knew what was right.

Ancel Keys, in embracing his role as part of the wise ruling class, ended up being the greatest success story of Lewis Terman’s eugenics project. He also demonstrated its failure, in that it turns out that being smart is not enough. He was brilliant in his aggressive displays of intellectual prowess and successful in his professional achievement, climbing the ladder of power and prestige, but he was neither a creative genius nor a visionary leader. Instead of thinking outside of the box, he forced everyone else into the box of his ideological biases, imposing the stunting effect of groupthink on several generations of scientific researchers and health experts, nutritionists and doctors. Maybe we should be unsurprised by this unhappy result (Quickie Post — Young Prodigies Usually Do Not Turn into Paradigm-Shifting Geniuses).

It could be argued that, at least in this case, the name ‘Termite’ was aptly descriptive of the harm caused to society. Now we are all suffering for it in the tragedy of our ever worsening public health crisis. And as if that weren’t bad enough, we have a new generation of paternalistic overlords who are repeating the same mistake in once again trying to enforce dietary dogma from on high (Dietary Dictocrats of EAT-Lancet), led by Walter Willett, the direct heir of Ancel Keys. The experiment of elite rule goes on and on.

* * *

The broad, ragged cut
by Elizabeth Svoboda

Despite initial resistance, the public accepted the notion of a test-driven meritocracy because it twined together two established strands of thought: first, that the spoils should go to the declared winner, and second, that high-performers’ abilities should be harnessed for the good of the nation. ‘To each according to their ability’ became the tacit watchword, a neat variant of the Marxist injunction ‘to each according to their need’.

The first aptitude-testers promoted the idea that each person had an innate, more-or-less fixed intellectual capacity. In the context of the early 20th century’s growing eugenics movement, the tests were often deployed to justify widespread racial discrimination. Terman claimed that what he called borderline deficient scores on the Stanford-Binet were ‘very, very common among Spanish-Indian and Mexican families of the Southwest and also among Negroes’. ‘Children of this group should be segregated into separate classes,’ he wrote in 1916. ‘They cannot master abstractions but they can often be made into efficient workers … From a eugenic point of view they constitute a grave problem because of their unusually prolific breeding.’ In Terman’s mind, then, low IQ scores were simply and unarguably the result of objective deficiency.

We now understand just how wrong that notion was. Today, many psychologists understand IQ and aptitude tests to be ‘culture-bound’ to one degree or another – that is, they evaluate abilities prized in the dominant Western culture, such as sorting items into categories, and can privilege those raised in that milieu. Such inequities have persisted despite attempts to make the tests fairer to those from non-dominant cultures.

As the US marinated in social Darwinism after the First World War, the government began devising its own sinister solution to the ‘grave problem’ of which Terman had warned. The US Supreme Court case Buck v Bell in 1927 ruled for compulsory sterilisation of the ‘feeble-minded’ in the name of public welfare. For more than four decades thereafter, US states sterilised thousands of people with low IQ scores; a disproportionate number of victims were nonwhite. In later years, though aptitude tests’ eugenic roots would fade from view, the ranking of test-takers according to perceived social value would continue unabated.

The History of Eugenics in America, Part II
by Steven Vigdor and Tim Londergan

In light of our current knowledge of nutrition and fitness, we now view J.H. Kellogg’s practices as a combination of exceptional insight, mixed with positively bizarre notions on medicine and health. The field of eugenics also combined significant advances in applied science with a set of misguided, and in some cases tragic, biases and prejudices.

J.H. Kellogg was an enthusiastic proponent of eugenics, and in 1911 he established the Race Betterment Foundation in Michigan. That Foundation held three national conferences on Race Betterment in 1914, 1915 and 1928. The Race Betterment congresses allowed advocates of eugenics to share their suggestions for the most effective practices that would lead to maintaining or improving ‘racial purity.’ Kellogg himself had a complicated relationship with the notion of racial purity, particularly with respect to blacks. He and his wife had no children, so over the course of their lives they raised a large number of foster children; this included a number of black youths.

On the other hand, Kellogg was a strong supporter of segregation and a firm believer that different races should not mix. Here Kellogg adopted a common theme from the eugenics movement that Nordics, Mediterraneans, Alpines, Mongolians and blacks all represented different ‘races.’ Kellogg warned of “the rapid increase of race degeneracy, especially in recent times,” and urged the adoption of steps that he claimed would result in the “creation of a new and superior human race.”

With his characteristic energy and ambition, Kellogg proposed a multi-step plan to save the U.S. from a calamitous fate. His plan included “a thoroughgoing health survey to be conducted in every community every five years, free medical dispensaries for the afflicted, the inspection of schools and schoolchildren, health education, prohibition of the sale of alcohol and tobacco, strict marriage laws in every state, and the establishment of experiment stations [devoted] to investigating the laws of heredity in plants, animals, and humans.”

A central feature of Kellogg’s plan was the creation of a ‘eugenic registry’ that would establish criteria for ‘proper breeding pairs.’ The idea was that individuals would provide their credentials to a central clearinghouse. Males who met the highest standards for racial ‘fitness’ would be paired with similarly ‘fit’ females and encouraged to marry (the idea was clearly inspired by similar matings with ‘pedigreed’ dogs and ‘bloodlines’ for horses).

Kellogg proposed central record-keeping offices for family pedigrees and the establishment of contests for ‘best babies’ and ‘fittest families.’ A few years later, such contests became common at state fairs across the U.S., as we will describe in the next section. In addition to the fairly sinister aspect of ‘racial purity,’ such contests also placed an emphasis on wellness, and offered useful tips on healthy diets and nutrition for young children. […]

The study of intelligence testing was then taken up by scientists such as Lewis Terman and Robert Yerkes. Terman, a psychologist at Stanford, made major revisions in Binet’s tests. He organized the tests into two parts. Part A included sections on arithmetical reasoning, sentence completion, logics, a synonym-antonym section, and a symbol-digit test. Part B included sections involving sentence completion, vocabulary, analogies, comparisons, and general information. Terman and associates tried out their tests on numerous cohorts of school children. Their aim was to determine the average performance of children in each grade from 3 to 8, and to administer the test to as many students as possible. They also performed numerous statistical tests, and arranged the grading to achieve an average of 100 for every grade, with a standard deviation of 15. The resulting “Stanford-Binet” test fairly rapidly became the standard in the field.

Terman was quite candid about his motives for universal testing. “It is safe to predict that in the near future intelligence tests will bring tens of thousands of those high-grade defectives under the surveillance and protection of society. This will ultimately result in curtailing the reproduction of feeble-mindedness and in the elimination of an enormous amount of crime, pauperism, and industrial inefficiency.” So, while Binet had insisted that his tests be administered only to provide assistance in improving the skills of slow learners, Terman and his hereditarian brethren were determined to identify, isolate and stigmatize precisely this group of children. Terman had no doubt that his tests represented measurements of innate intelligence, and that intelligence was almost entirely determined by heredity. “The children of successful and cultured parents test higher than children from wretched and ignorant homes for the simple reason that their heredity is better.”

Terman also recommended that businesses use IQ tests in hiring decisions. He argued that “substantial success” as a leader required an IQ of at least 115 to 120. Furthermore, people with IQs below 100 should not be hired for demanding or high-paying jobs. Terman was even more specific: people with IQ below 75 should only be qualified for menial tasks, and the 75-85 level for semi-skilled labor. People with an IQ of 85 or lower should be tracked into vocational schools, so that they would not leave school and “drift easily into the ranks of the anti-social or join the army of Bolshevik discontents.”

The Vexing Legacy of Lewis Terman
by Mitchell Leslie

Terman, who had grown up gifted himself, was gathering evidence to squelch the popular stereotype of brainy, “bookish” children as frail oddballs doomed to social isolation. He wanted to show that most smart kids were robust and well-adjusted — that they were, in fact, born leaders who ought to be identified early and cultivated for their rightful roles in society.

Though the more than 1,000 youngsters enrolled in his study didn’t know it at the time, they were embarking on a lasting relationship. As Terman poked around in their lives with his inquisitive surveys, “he fell in love with those kids,” explains Albert Hastorf, emeritus professor of psychology. To the group he always called “my gifted children” — even after they grew up — Terman became mentor, confidant, guidance counselor and sometimes guardian angel, intervening on their behalf. In doing so, he crashed through the glass that is supposed to separate scientists from subjects, undermining his own data. But Terman saw no conflict in nudging his protégés toward success, and many of them later reflected that being a “Terman kid” had indeed shaped their self-images and changed the course of their lives. […]

A story of a different kind emerges from Terman’s own writings — a disturbing tale of the beliefs of a pioneer in psychology. Lewis Terman was a loving mentor, yes, but his ardent promotion of the gifted few was grounded in a cold-blooded, elitist ideology. Especially in the early years of his career, he was a proponent of eugenics, a social movement aiming to improve the human “breed” by perpetuating certain allegedly inherited traits and eliminating others. While championing the intelligent, he pushed for the forced sterilization of thousands of “feebleminded” Americans. Later in life, Terman backed away from eugenics, but he never publicly recanted his beliefs. […]

Many who did well in their fields had received no boost from Terman beyond an occasional pat on the back and the knowledge that they’d qualified for his study. For others, like Dmytryk, Terman’s intervention was life-changing. We’ll never know all that he did for his kids, Hastorf notes. But it’s clear that Terman helped several get into Stanford and other universities. He dispatched numerous letters of recommendation mentioning that individuals took part in his project. And one time, early in World War II, he apparently pulled strings on behalf of a family of Japanese-Americans in his study. Fearing they were about to be interned, they wrote to Terman for help. He sent a letter assuring the federal government of their loyalty and arguing against internment. The family remained free.

From a scientific standpoint, Terman’s personal involvement seems foolish because it probably skewed his results. “It’s what you’d expect a mentor to do, but it’s bad science,” Hastorf says. As a conscientious researcher whose work got him elected to the National Academy of Sciences, Terman should have known better — but he wasn’t the first or last to slip. Indeed, the temptation to meddle is an occupational hazard among longitudinal researchers, says Glen Elder Jr., a sociologist at the University of North Carolina. A certain degree of intimacy develops, he explains, because “we’re living in their lives and they’re living in ours.”

It’s difficult to gauge Terman’s influence on the kids because so many are deceased or still anonymous. One survivor willing to speak on the record is Russell Robinson, a retired engineer and former director of aeronautical research at NASA Ames. He was a high school student in Santa Monica when, he recalls, “someone in the school system tapped me on the shoulder and said, ‘Dr. Terman would like to test you, if you’re willing.'” Robinson, now 92 and living in Los Altos, doesn’t think being in the study significantly changed his life, but he did draw confidence from knowing that Terman thought highly of him. Several times during his career, he mentally invoked Terman to shore up his self-image. “Research is a strange business — in a sense, you’re out there alone,” he says. “Sometimes, the problems got so complex I would ask myself, Am I up to this? Then I would think, Dr. Terman thought I was.”

Others have echoed that sentiment, Hastorf says. In fact, the study meant so much to some of the subjects that the Terman project now runs entirely on their bequests.

Several Terman kids have cited a negative impact on their lives. Some complained of being saddled with an unfair burden to succeed, Hastorf says, while others thought that being dubbed geniuses at an early age made them cocky and complacent. For better or worse, a quarter of the men and almost a third of the women said they felt that being a Terman kid had changed their lives. And since Terman often did his meddling behind the scenes, others may have been influenced without ever realizing it.

His support of the gifted was heartfelt, but an equally fundamental part of Terman’s social plan was controlling the people at the other end of the intelligence scale. Both were aims of eugenics, a movement that gained momentum early in the 20th century.

The eugenicists of Terman’s day held that people of different races, nationalities and classes were born with immutable differences in intelligence, character and hardiness, and that these genetic disparities called for an “aristogenic” caste system. Traits like feeblemindedness, frailty, emotional instability and “shiftlessness,” they believed, were controlled by single genes and could be easily eliminated by controlling the reproduction of the “unfit.” In the United States, the movement peddled a topsy-turvy form of Darwinism, claiming that the “fittest” (defined as well-to-do whites of Northern European ancestry) were reproducing too slowly and in danger of being overwhelmed by the inferior lower strata of society. America was jeopardized from within, eugenicists warned, by the rapid proliferation of people lacking intelligence and moral fiber. From without, the threat was the unchecked arrival of immigrants from southern and eastern Europe. Together these groups would drag down the national stock.

Terman’s letters and published writings show that he shared these beliefs and argued for measures to reverse society’s perceived deterioration. He was a member of the prominent eugenics societies of the day. “It is more important,” he wrote in 1928, “for man to acquire control over his biological evolution than to capture the energy of the atom.” Yet he wasn’t a renegade howling from the fringe. Eugenics was “hugely popular in America and Europe among the ‘better sort’ before Hitler gave it a bad name,” as journalist Nicholas Lemann puts it. Luminaries who supported at least part of the early eugenic agenda include George Bernard Shaw, Theodore Roosevelt, Margaret Sanger, Calvin Coolidge and Oliver Wendell Holmes Jr. In fact, Terman sat on the boards of two eugenics organizations with Stanford’s first president, David Starr Jordan.

Early eugenicists managed to push through several laws. Thirty-three states, including California, passed measures requiring sterilization of the feebleminded. As a result, more than 60,000 men and women in mental institutions were sterilized — most against their will and some thinking they were getting an emergency appendectomy. In 1924, Congress set quotas that drastically cut immigration from eastern and southern Europe. Though pressure to stem immigration had come from many sources, including organized labor, the quotas had an undeniably racist taint. Terman cheered these efforts.

During the 1930s, as the brutality of Nazi policies and the scientific errors of eugenic doctrines became clearer, the eugenics movement withered in the United States and Terman inched away from his harshest views. Later in life, he told friends he regretted some of his statements about “inferior races.” But unlike several prominent intelligence-testers, such as psychologist Henry Goddard and SAT creator Carl Brigham, Terman never publicly recanted.

At least one eugenic measure proved as stubborn as he was. News of the Nazis’ mass sterilization program did not put an end to the practice in the United States, where sterilizations of the mentally ill and retarded continued well into the 1970s.

Terman left a difficult legacy. On one hand, his work inspired almost all the innovations we use today to challenge bright students and enrich their education. As he followed the lives of intelligent kids, he also became their best publicist, battling a baseless prejudice. As a scientist, he devised methods for assessing our minds and behaviors, helping put the field of psychology on an empirical and quantitative foundation. He was one of Stanford’s first nationally prominent scholars, and as a department chair for two decades, he transformed the psychology department from a languid backwater into an energetic, top-ranked program. He established the longitudinal method and generated an archive of priceless data. Longitudinal studies have “become the laboratory of the social sciences” and are growing in importance as the population ages, UNC sociologist Elder observes.

On the other hand, as biographer Minton points out, the very qualities that made Terman a groundbreaking scientist — his zeal, his confidence — also made him dogmatic, unwilling to accept criticism or to scrutinize his hereditarian views. A similar paradox existed in his social agenda. Terman was a visionary whose disturbing eugenic positions and loving treatment of the gifted grew out of the same dream for an American meritocracy.

“He was a very nice guy, but I have some things I would argue with him about,” Hastorf declares. His conclusion is that Terman was as much a product of his time as a force for change — and that, like many powerful thinkers, he was complex, contradictory and not always admirable.

The Parable of the Talents
by Scott Alexander
from comment section:

Harald K says:
“The IQ pioneers were social reformers who wanted to reduce human suffering.”

Oh sure. By turning as much as possible of decision making over to them, or resisting efforts to take away the privileges they already had, i.e. egalitarian efforts. I don’t hold much faith in the good will of US eugenicists, any more than their German cousins. The decision of which of other people’s genes deserve to survive to the next generation, is one which every human is hopelessly biased, and every decision is hopelessly corrupt.

Sure, many socialists were fooled too by the eugenicists’ crocodile tears for humanity, but it’s an inherently and irreparably selfish practice, only morally compatible with every man for himself/might makes right morality. I could have told them (and many DID tell them).

Binet can get a pass, sort of. His concern was mainly about who would do well in the French school system. Goddard imported the Binet test before Terman turned it into the first IQ test, so hardly “the man who brought IQ tests to America”. I wonder who can look at Goddard’s wikipedia page for arguments that he had such noble intentions, and overlook how he argued that Americans were unfit for democracy, or how he let first and second class skip the intelligence testing for immigration demand on Ellis Island.

IQ tests were invented in America, by Lewis Terman. From the moment Terman touched the test, it was conscripted to the service of racism and elitism.

“Does that sound like radical antihumanism? Nope.”

Yes, it does. Note how it promotes the welfare of humanity in the abstract, at the expense of concrete humans living here and now. But as I’ve argued, even that is just a fig leaf for the crudest power-grab a biological human can possibly make.

Harald K says:
“And ended up making a tool that predicts that the Chinese and Jews should be doing it instead of them.”

Ah, here it becomes relevant that the IQ of today isn’t really Terman’s IQ. Today’s test make Chinese people look good, but Terman’s test didn’t. It didn’t try to be culturally independent at all, so if you administered it to a Chinese person, he’d score horribly. There were even questions which obviously coded for social class, like where would you go to buy certain products.

It was in response to such criticism that they gradually tried to make the tests more independent of culture and language. It was not such a great sacrifice for them to open up for the possibility that some groups may on average do slightly better than your group, once the tests had scientifically established that they, individually, were superior beings.

But as they did so, the tests became less useful for prediction of success. (It turns out upper class white kids are more successful than kids who go to the liquor store to buy sugar, even if the latter kids are otherwise clever. Who knew?).

Elof Carlson says:
There are several difficulties with using a single number to measure intelligence, in a spectrum running from retarded to genius. Issue one is the diversity of talents. As you point out musical genius is not correlated to IQ test genius because there are many people in the 160 plus range who have little music appreciation or talent. My mentor, HJ Muller, had a 165 IQ measured by Anne Roe, but he had no ear for music. The same might be true for artistic expression among museum quality artists. It might also be true for creativity. The second issue is the role of home environment. This varies a lot. In general those in poverty have lower IQ scores than those who have wealthy home environments. Premeds who take MCAT Kaplan courses do better than those who do not. Those who go to elite private schools do better than those who go to public high schools. Having a private tutor helps even more. The wealthy can afford such luxuries for their children. The poor cannot.

A book that changed my mind about the usefulness of IQ scores was Cradles of Eminence by Victor and Muriel Goertzel. They wanted to compare Terman’s study of 1000 high IQ California kids with eminence. They defined eminence as having two or more biographies written about a person who is not royalty or a sports figure. They found that none of the Terman kids had biographies written about them. They mostly became health professionals, CEOs, lawyers, engineers and solid middle class and contented adults. They found that those who had biographies written about them often had unstable middle class homes (e.g., a neurotic or psychotic parent, an alcoholic parent, a financial collapse in business leading downward in social class, a parent who was a zealot for a cause). They argued that it was the conflict at home (the parents were nevertheless loving to their children) that led these students to creative activities that set them apart. The Terman kids were teachers’ pets, loved school, and aced all their tests. The Goertzel biographees often disliked school (they were bored by it), were often misinterpreted by their teachers as lazy or mentally disturbed or nonconforming. Very few of the high IQ Terman kids were in the arts or wrote fiction. Many of the Goertzel biographees had careers in the arts (but about a majority of both groups chose science careers). None of the Terman kids won a Nobel or Pulitzer. Numerous of the Goertzel biographees did win Nobels and Pulitzers.

I hope you will read that book and comment on it. I believe IQ measures effectiveness in test-taking. That may be innate. It certainly has value in who gets into medical school or who succeeds academically. I believe creativity is independent of IQ score and no one has developed an objective quantitative measure of that creativity in whatever field people excel.

American Heart Association’s “Fat and Cholesterol Counter” (1991)

  • 1963 – “Every woman knows that carbohydrates are fattening, this is a piece of common knowledge, which few nutritionists would dispute.”
  • 1994 – “… obesity may be regarded as a carbohydrate-deficiency syndrome and that an increase in dietary carbohydrate content at the expense of fat is the appropriate dietary part of a therapeutical strategy.”*

My mother was about to throw out an old booklet from the American Heart Association (AHA), “Fat and Cholesterol Counter”, one of several publications they put out around that time. It was published in 1991, the year I started high school. Unsurprisingly, it blames everything on sodium, calories, cholesterol, and, of course, saturated fat.

Even hydrogenated fat gets blamed on its saturated fat content, since the hydrogenation process turns some small portion of it saturated, which ignores the heavy damage and inflammatory response caused by the oxidation process (both in the industrial processing and in cooking). Not to mention that those hydrogenated fats, as industrial seed oils, are filled with omega-6 fatty acids, the main reason they are so inflammatory. Saturated fat, on the other hand, is not inflammatory at all. This obsession with saturated fat is so strange. It never made any sense from a scientific perspective. When the obesity epidemic began, and all that went with it, the consumption of saturated fat by Americans had been steadily dropping for decades, ever since the invention of industrial seed oils in the late 1800s and the fear about meat caused by Upton Sinclair’s muckraking novel about the meatpacking industry, The Jungle.

The amount of saturated fat and red meat in the diet has declined over the past century, replaced with those industrial seed oils and lean white meat, along with fruits and vegetables — all of which have been increasing.** Chicken, in particular, replaced beef, and what stands out about chicken is that, like those industrial seed oils, it is high in the inflammatory omega-6 fatty acids. How could saturated fat be causing the greater rates of heart disease and such when people were eating less of it? This scapegoating wasn’t only unscientific but blatantly irrational. All of this info was known way back when Ancel Keys went on his anti-fat crusade (The Creed of Ancel Keys). It wasn’t a secret. Explaining it away required cherry-picked data and convoluted rationalizations.

Worse than removing saturated fat when it’s not a health risk is the fact that it is actually an essential nutrient for health: “How much total saturated do we need? During the 1970s, researchers from Canada found that animals fed rapeseed oil and canola oil developed heart lesions. This problem was corrected when they added saturated fat to the animals diets. On the basis of this and other research, they ultimately determined that the diet should contain at least 25 percent of fat as saturated fat. Among the food fats that they tested, the one found to have the best proportion of saturated fat was lard, the very fat we are told to avoid under all circumstances!” (Millie Barnes, The Importance of Saturated Fats for Biological Functions).

It is specifically lard that has been most removed from the diet, and this is significant as lard was central to the American diet until this past century: “Pre-1936 shortening is comprised mainly of lard while afterward, partially hydrogenated oils came to be the major ingredient” (Nina Teicholz, The Big Fat Surprise, p. 95); “Americans in the nineteenth century ate four to five times more butter than we do today, and at least six times more lard” (p. 126). And what about the Mediterranean people who supposedly are so healthy because of their love of olive oil? “Indeed, in historical accounts going back to antiquity, the fat more commonly used in cooking in the Mediterranean, among peasants and the elite alike, was lard” (p. 217).

Jason Prall notes that long-lived populations ate “lots of meat” and, specifically, “They all ate pig. I think pork was the only common animal that we saw in the places that we went” (Longevity Diet & Lifestyle Caught On Camera w/ Jason Prall). The famously long-lived Okinawans also partake of everything from pigs, such that their entire culture and religion centered around pigs (Blue Zones Dietary Myth). Lard, in case you didn’t know, comes from pigs. Pork and lard are found in so many diets for the simple reason that pigs can live in diverse environments, from mountainous forests to tangled swamps to open fields, and they are a food source available year round.

Another thing that has gone hand in hand with the loss of healthy, nutrient-dense saturated fat in the American diet is a loss of nutrition in general. It’s not only that plant foods have fewer minerals and vitamins because of depleted soil and because they are picked unripe in order to ship them long distances. The same is true of animal foods, since the animals are being fed the same crappy plant foods as us humans. But at the very least, even factory-farmed animal foods have far more bioavailable nutrient-density than plant foods from industrial agriculture. If we ate more fatty meat, saturated fat or otherwise, we’d be getting far more fat-soluble vitamins. And looking at animal foods as a whole, in particular from pasture-raised and wild-caught sources, there is no mineral or vitamin that can’t be obtained at required levels. The same can’t be said for plant foods on a vegan diet.

Back in 1991, the AHA was recommending the inclusion of lots of bread, rolls, crackers, and pasta (“made with low-fat milk and fats or oils low in saturated fatty acids” and “without eggs”); rice, beans, and peas; sugary fruits and starchy vegetables (including juices) — and desserts were fine as well. At most, eat 3 or 4 eggs a week and, as expected, optimally avoid the egg yolks, where all the nutrition is located (not only fat-soluble vitamins, but also choline and cholesterol and much else; by the way, your brain health is dependent on high levels of dietary cholesterol, such that statins, in blocking cholesterol, cause neurocognitive decline). As long as there was little if any saturated fat, and fat in general was limited, buckets of starchy carbs and sugar were considered by the AHA to be part of a healthy and balanced diet. That is sad.

This interested me because of the year. This was as I was entering young adulthood and so becoming more aware of the larger world. I remember the heavy-handed propaganda preaching that fiber is good and fat is evil, as if the war on obesity were a holy crusade that demanded black-and-white thinking, all subtleties and complexities denied in adherence to the moralistic dogma against the sins of gluttony and sloth — it was literally an evangelistic medical gospel (see Belinda Fettke’s research on the Seventh Day Adventists: Thou Shalt not discuss Nutrition ‘Science’ without understanding its driving force). In our declining public health, we were a fallen people who required a dietary clergy for our salvation. Millennia of traditional dietary wisdom and knowledge were thrown out the window as if they were worthless or maybe even dangerous.

I do remember my mother buying high-fiber cereals and “whole wheat” commercial breads (not actually whole wheat, as it is simply denatured refined flour with fiber added back in). And along with this, skim or 1% fat dairy foods, especially milk, were included with every major meal and often snacks. I had sugary and starchy cereal with skim milk (and/or milk with sugary Instant Breakfast) every morning and a glass of skim milk with every dinner, maybe sometimes milk for lunch. Cheese was a regular part of the diet as well, such as with pizza eaten multiple times a week or any meal with pasta, and heck, cheese was a great snack all by itself, but also good combined with crackers, and one could pretend to be healthy if one used Triscuits. Those were the days when I might devour a whole block of cheese, probably low-fat, in a single sitting — I was probably craving fat-soluble vitamins. Still, most of my diet was starches and sugar, as that was my addiction. The fiber was an afterthought to market junk food as health food.

It now makes sense. When I was a kid in the 1980s, my mother says, the doctor understood that whole-fat milk was important for growing bodies. So that is what he recommended. But I guess the anti-fat agenda had fully taken over by the 1990s. The AHA booklet from 1991 was by then recommending “skim or 1% milk and low-fat cheeses” for all ages, including babies and children, pregnant and lactating women. Talk about a recipe for health disaster. No wonder metabolic syndrome exploded and neurocognitive health fell like a train going over a collapsed bridge. It was so predictable, as the failure of this diet was understood by many going back to earlier in the century (e.g., Weston A. Price; see my post Health From Generation To Generation).

The health recommendations did get worse over time, but to be fair the problem started much earlier. Experts had been discouraging breastfeeding for a while. Traditionally, babies were breastfed for the first couple of years or so. By the time modern America came around, experts were suggesting a short period of breast milk or even entirely replacing it with scientifically designed formulas. My mother only breastfed me for 5-6 months and then put me on cow’s milk — of course, pasteurized and homogenized milk from grain-fed and factory-farmed cows. When the dairy caused diarrhea, the doctor suggested soy milk. After a while, my mother put me on dairy again, but the diarrhea persisted, and so for preschool she put me back on soy milk. I was drinking soy milk off and on for many years during the most important stage of development. Holy fuck! That had to have done serious damage to my developing body, in particular my brain. Then I went from that to skim milk during another important time of development, as I hit puberty and went through growth spurts.

Early on in elementary school, I had delayed reading and a diagnosis of learning disability, seemingly along with something along the lines of either Asperger’s or specific language impairment, although undiagnosed. I definitely had social and behavioral issues, in that I didn’t understand people well when I was younger. Then, entering adulthood, I was diagnosed with depression and something like a “thought disorder” (I forget the exact diagnosis I got while in a psychiatric ward after a suicide attempt). No doubt the latter was already present in my early neurocognitive problems, as I obviously was severely depressed at least as early as 7th grade. A malnourished diet of lots of carbs and little fat was the most probable cause of all of these problems.

Thanks, American Heart Association! Thanks for doing so much harm to my health and making my life miserable for decades, not to mention nearly killing me through depression so severe I attempted suicide, and then the decades of depressive struggle that followed. That isn’t even to mention the sugar and carb addiction that plagued me for so long. Now multiply my experience by that of at least hundreds of millions of other Americans, and an even greater number of people elsewhere as their governments followed the example of the United States, across the past few generations. Great job, AHA. And much appreciation for the helping hand of the USDA and various medical institutions in enforcing this anti-scientific dogma.

Let me be clear about one thing. I don’t blame my mother, as she was doing the best she could with the advice given to her by doctors and corporate media, along with the propaganda literature from respected sources such as the AHA. Nor do I blame any other average Americans as individuals, although I won’t hold back from placing the blame squarely on the shoulders of demagogues like Ancel Keys. As Gary Taubes and Nina Teicholz have made so clear, this was an agenda of power, not science. With the help of government and media, the actual scientific debate was silenced and disappeared from public view (Eliminating Dietary Dissent). The consensus in favor of a high-carb, low-fat diet didn’t emerge through rational discourse and evidence-based medicine — it was artificially constructed and enforced.

Have we learned our lesson? Apparently not. We still see this tactic of technocratic authoritarianism, such as with the corporate-funded push behind EAT-Lancet (Dietary Dictocrats of EAT-Lancet). Why do we tolerate this agenda-driven exploitation of public trust and harm to public health?

* * *

* First quote: Passmore, R., and Y. E. Swindells. 1963. “Observations on the Respiratory Quotients and Weight Gain of Man After Eating Large Quantities of Carbohydrates.” British Journal of Nutrition 17: 331-39.
Second quote: Astrup, A., B. Buemann, N. J. Christensen, and S. Toubro. 1994. “Failure to Increase Lipid Oxidation in Response to Increasing Dietary Fat Content in Formerly Obese Women.” American Journal of Physiology, April, 266 (4, pt. 1): E592-99.
Both quotes are from a talk given by Peter Ballerstedt, “AHS17 What if It’s ALL Been a Big Fat Lie?,” available on the Ancestry Foundation YouTube page.

(It appears that evidence-based factual reality literally changes over time. I assume this relativity of ideological realism has something to do with quantum physics. It’s the only possible explanation. I’m feeling a bit snarky, in case you didn’t notice.)

** Americans, in prior centuries, ate few plant foods because they were so difficult and time-consuming to grow. There was no way to control the pests and wild animals that often would devour and destroy a garden or a crop. It was too much investment for too little reward, not to mention extremely unreliable as a food source and so risky to survival for those with a subsistence lifestyle. Until modern farming methods, especially with the 20th-century industrialization of agriculture, most Americans primarily ate animal foods with tons of fat — mostly butter, cream, and lard — along with a wide variety of wild-caught animal foods.

This is discussed by Nina Teicholz in The Big Fat Surprise: “Early-American settlers were ‘indifferent’ farmers, according to many accounts. They were fairly lazy in their efforts at both animal husbandry and agriculture, with ‘the grain fields, the meadows, the forests, the cattle, etc, treated with equal carelessness,’ as one eighteenth-century Swedish visitor described. And there was little point in farming since meat was so readily available.” (See more in my post Malnourished Americans.) That puts the conventional dietary debate in an entirely different context. Teicholz adroitly dismantles the claim that fatty animal foods have increased in the American diet.

Teicholz goes on to state: “So it seems fair to say that at the height of the meat-and-butter-gorging eighteenth and nineteenth centuries, heart disease did not rage as it did by the 1930s. Ironically—or perhaps tellingly—the heart disease ‘epidemic’ began after a period of exceptionally reduced meat eating.” It was the discovery of seed oils, originally an industrial byproduct, combined with Upton Sinclair’s muckraking journalism about the meatpacking industry (The Jungle), that caused meat and animal fats to quickly fall out as the foundation of the American diet. Saturated fat, in particular, had been in decline for decades prior to the epidemics of obesity, diabetes, and heart disease. Ancel Keys knew this, which is why he had to throw out some of his data to make it fit his preconceived conclusions in promoting his preferred dietary ideology.

If we honestly wanted to find the real culprit, we would look to the dramatic rise of vegetable oils, white flour, and sugar in the 20th-century diet. The shift began much earlier with grain surpluses and cheap wheat, especially in England during the 1800s, but in the United States it became most noticeable in the first half of the 20th century. The agenda of Keys and the AHA simply made a bad situation worse — much, much worse.

The Creed of Ancel Keys

“From the very beginning, we had the statistical means to understand why things did not add up; we had a boatload of Cassandras, a chorus of warnings; but they were ignored, castigated, suppressed. We had our big fat villain, and we still do.”
~ Trevor Butterworth, The Wall Street Journal

“The paradox is that medicine is supposedly more enlightened, but it has never been more tyrannical, hierarchical, controlled, intolerant, and dogmatic. Working doctors who dissent are cowed because failure to comply with the medical orthodoxy threatens livelihood and registration. Much of modern medicine is an intellectual void.”
~ Dr Des Spence, Scottish GP

“The suppression of inconvenient evidence is an old trick in our profession. The subterfuge may be due to love of a beautiful hypothesis, but often enough it is due to a subconscious desire to simplify a confusing subject. It is not many years ago that the senior physician of a famous hospital was distinctly heard to remark, sotto voce, “medicine is getting so confusing nowadays, what with insulin and things.” It is a sentiment with which almost everybody who qualified more than a quarter of a century ago is likely to sympathize…. But ignoring difficulties is a poor way of solving them.”
~ Raymond Greene, in a letter to The Lancet, 1953

A popular documentary out right now is The Magic Pill. It’s about the Paleo diet, with some emphasis on ketosis (low-carb consumption causing fat to become the primary energy source for cellular metabolism). There are several varieties of the Paleo diet, as there was much diversity in ancient dietary patterns, but there are some key commonalities.

Earlier humans ate little if any grains or beans, often even well into the agricultural period (hunting and gathering remained a mainstay of the American diet for many up into the early-to-mid 20th century, such as my mother’s family when she was growing up). In the distant past, and continuing until about a century ago, it was typical to eat lots of raw, fermented, and cultured foods — including meats.

And of course, animal fats rich in saturated fat have always been a major food component until the past few generations. It turns out some of the healthiest populations on the planet, including the Mediterranean peoples, traditionally ate high levels of saturated fat. The Masai, for example, are about as carnivorous as a population can be, with a heavy emphasis on saturated fat, and their health is amazing:

“The Masai are almost pure carnivores, eating mostly milk, blood, and meat. A Masai man drinks up to a gallon of whole milk daily, and on top of that he might also eat a lot of meat containing still more saturated fat and cholesterol. Mann expected the Masai to have high blood cholesterol but was surprised to find it was among the lowest ever measured, about 50 percent lower than that of the average American.”
(Real Food by Nina Planck, p. 61)

Interestingly, Americans too used to load up on animal foods and saturated fats, along with a ton of raw whole milk, cheese, and butter. It was only after decades of decline in this earlier diet that Americans began having high rates of all the major diseases that now plague us: obesity, heart disease, diabetes, etc.

This leads us to Ancel Keys, the man who promoted many of the present mainstream dietary myths. More than a half century ago, he did some research comparing diets in different regions of the world, but he did so by cherry-picking what fit his preconceptions and ignoring all else (great analysis can be found in numerous videos, articles, and books by Sally Fallon Morell and Mary Enig and at the Weston A. Price Foundation). In Nourishing Diets, Morell writes (pp. 124-5):

“Critics have pointed out that Keys omitted from his study many areas of the world where consumption of animal foods is high and deaths from heart attack are low, including France — the so-called French paradox. But there is also a Japanese paradox. In 1989, Japanese scientists returned to the same two districts that Keys had studied. In an article titled “Lessons for Science from the Seven Countries Study,” they noted that per capita consumption of rice had declined, while consumption of fats, oils, meats, poultry, dairy products and fruit had all increased. […]

“During the postwar period of increased animal consumption, the Japanese average height increased three inches and the age-adjusted death rate from all causes declined from 17.6 to 7.4 per 1,000 per year. Although the rates of hypertension increased, stroke mortality declined markedly. Deaths from cancer also went down in spite of the consumption of animal foods.

“The researchers also noted — and here is the paradox — that the rate of myocardial infarction (heart attack) and sudden death did not change during this period, in spite of the fact that the Japanese weighed more, had higher blood pressure and higher cholesterol levels, and ate more fat, beef and dairy foods.”

About the Mediterranean diet, Morell considers the historical context of Keys’ study:

“The question that the believers haven’t asked themselves is this: was the lean, so-called Mediterranean diet they observed after World War II the true Mediterranean diet? Or were they observing the tail end of deprivation engendered by half a decade of conflict? Were the inhabitants of Crevalcore and Montegiorgio abandoning the traditional diet, or were they taking it up again? And did Keys miss the sight of Italians enjoying rich food in the early 1950s because Italians had never done such a shameful thing, or was the visiting professor too poor at the time to afford anything more than plain pizza in a sidewalk cafe?” (pp. 157-8)

Morell then goes on to look at numerous historical texts, including early cookbooks, from the region. All the evidence points to the traditional Mediterranean diet consisting largely of whole-fat dairy products, meat products (lots of sausage), oils and animal fats, and eggs. As emphasized in the Paleo diet,

“Italians love their vegetables for sure, and that’s because they know how to make them taste good. They know that salads taste better with a good dressing of aged vinegar and olive oil; and cooked vegetables blossom when anointed with butter, lard or cream” (p. 160).

Keys didn’t really understand the societies he was studying, much less the societies he chose to ignore. Yet he was charismatic and, though other contemporary research contradicted his data, he was able to promote his views such that they became adopted as mainstream ideology. This new belief system was enforced by the US government and by corporations, often in heavy-handed ways. Adelle Davis was a biochemist and nutritionist who was inspired by Weston A. Price’s research on traditional diets. As Joann Grohman describes, “The FDA raided health food stores and seized her books under a false labeling law because they were displayed next to vitamin bottles” (Real Food by Nina Planck, p. 30). “I find it dismaying that,” Planck says in another section (p. 201),

“the dangers of trans fats were known for sixty years. Weston Price cited 1943 research that butter was better than hydrogenated cottonseed oil. In the 1950s, researchers guessed that hydrogenated vegetable oil led to heart disease. Ancel Keys, the proponent of monounsaturated fat, showed in 1961 that hydrogenated corn oil raised triglycerides more than butter. Year after year, the bad news piled up. [So, even Keys ultimately knew that saturated fat wasn’t the real culprit.]

“One dogged researcher, Mary Enig, helped get the word out. The author of Know Your Fats, Enig waged an often lonely battle. I’m afraid her efforts were not always welcomed with bouquets of roses. In 1978, Enig wrote a scientific paper challenging a government report blaming saturated fat for cancer, in which she pointed out that the data actually showed a link with trans fats. Not long after, “two guys from the Institute of Shortening and Edible Oils — the trans fat lobby, basically — visited me, and oh boy, were they angry,” Enig told Gourmet magazine. “They said they’d been keeping a careful watch to prevent articles like mine from coming out and didn’t know how this horse had gotten out of the barn.”

“The stakes were high. “We spent lots of time, and lots of money and energy, refuting this work,” said Dr. Lars Wiederman, who once worked for the American Soybean Association. “Protecting trans fats from the taint of negative scientific findings was our charge.””

That sounds a lot like the corporatist defense of profits seen in the decades of lies, spin, and obfuscation pushed by the tobacco and oil companies. Another, more recent example is given in The Magic Pill documentary. In South Africa, the government put a doctor on trial for daring to give dietary advice in line with millennia-old traditions of human eating — fortunately, the doctor won his case, but only after the government spent an immense amount of taxpayer money trying to destroy him.

Dominant paradigms die hard and only after an immense fight, backed by the full power of the government and millions of corporate dollars. But that is only one part of what slows down change. Ideologies as worldviews hold on for so long because they become entrenched in our minds and cultures. As is often noted, old scientists (along with old doctors, professors, bureaucrats, etc.) don’t change their minds; they eventually die and are replaced by a new generation with new ideas.

This was demonstrated in Michael Pollan’s latest documentary, In Defense of Food (transcript). In it, the professor of nutrition Marion Nestle adds a note of caution: “And it should be written on every single epidemiological study, ‘Red flag, association does not necessarily mean causation.’” Does that stop Pollan from basing conclusions on Keys’ problematic research? Nope. Instead, he promotes the belief that Keys’ conclusions are still valid: “But based on the strong association Keys saw in his data between heart disease and saturated fat, he advised people to eat less of it.” Not a single mention of any doubt or criticism.

It might be noted that Pollan was born in 1955. That was right in the middle of this now-dominant ideology’s rise to ascendance. He reached adulthood as Keys’ ideology was being promoted by the USDA and as it became the new creed in mainstream thought. Now in his sixties, he is one of the older generation still clinging to what they were taught growing up. Yet, as a Boomer, he remains at the peak of his influence. Despite all the Western ailments, conventional medicine has allowed people to live longer, and that means ideologies will remain entrenched for longer.

It’s going to be an uphill battle for younger generations to challenge the status quo. But the shift is already happening. From a personal perspective, this time lag of common knowledge creates a sense of disorientation, as it will take at least decades for official advice and public opinion to catch up with the research that has been accumulating over this past century.

This point was emphasized for me in reading a book published two decades ago in 1998, The Fats of Life by Caroline M. Pond. The author, a mainstream academic and researcher, notes that “Heart attacks are thus seen as arising from a deficiency of polyunsaturated fatty acids rather than from an excess of saturates or cholesterol” (p. 293). This is far from being new knowledge. Pond doesn’t mention Weston A. Price, but she does discuss “the Oxford physician and biochemist, Hugh Sinclair (1910-1990), who studied the diet and habits of the Eskimos in northern Canada in 1944. Sinclair noted that Eskimos rarely suffered from heart disease or strokes in spite of a very high-fat diet that included reindeer meat.” She goes on to say that “The Masai people of Kenya eat large quantities of ruminant milk and meat, and Jamaicans eat saturated fats in coconut oil, but few of them die from heart attacks.”

In The Magic Pill, it is pointed out that Americans have been following the USDA Food Pyramid, eating less red meat and saturated fat while eating more grains, legumes, vegetables, and fruits. More Americans have been eating as they were told. And what has resulted from this drastic dietary change? All the diseases this diet is supposed to prevent have gotten worse. This stark reality has yet to sink in, because it would require thousands of officials and authority figures to admit not only that they were wrong but that they caused immense harm to so many.

But why do others continue on with the sham? We’ve known much of this info for a long time now. Why are we still debating it as if the conventional view still has any relevance?

* * *

About silencing the critics:

Good Calories, Bad Calories
by Gary Taubes
pp. 191-194

This is where the story now takes some peculiar turns. One immediate effect of the revelation about HDL, paradoxically, was to direct attention away from triglycerides, and with them the conspicuous link, until then, to the carbohydrate hypothesis. Gordon and his colleagues had demonstrated that when both HDL and triglycerides were incorporated into the risk equations of heart disease, or when obesity and the prediabetic condition of glucose intolerance were included in the equations along with triglycerides, the apparent effect of triglycerides diminished considerably. This result wasn’t surprising, considering that low HDL, high triglycerides, obesity, and glucose intolerance all seemed to be related, but that wasn’t the point. The relevant question for physicians was whether high triglycerides by themselves caused heart disease. If so, then patients should be advised to lower their triglycerides, however that might be accomplished, just as they were being told already to lower cholesterol. These risk-factor equations (known as multivariate equations) suggested that triglycerides were not particularly important when these other factors were taken into account, and this was how they would be perceived for another decade. Not until the late 1980s would the intimate association of low HDL, high triglycerides, obesity, and diabetes be considered significant—in the context of Gerald Reaven’s Syndrome X hypothesis—but by then the heart-disease researchers would be committed to the recommendations of a national low-fat, high-carbohydrate diet.

Heart-disease researchers would also avoid the most obvious implication of the two analyses—that raising HDL offers considerably more promise to prevent heart disease than lowering either LDL or total cholesterol—on the basis that this hadn’t been tested in clinical trials. Here the immediate obstacle, once again, was the institutional investment in Keys’s hypothesis. The National Institutes of Health had committed its heart-disease research budget to two ongoing studies, MRFIT and the Lipid Research Clinics Trial, which together would cost over $250 million. These studies were dedicated solely to the proposition that lowering total cholesterol would prevent heart disease. There was little money or interest in testing an alternative approach. Gordon later recalled that, when he presented the HDL evidence to the team of investigators overseeing MRFIT, “it was greeted with a silence that was very, how should I say it, expressive. One of them spoke up indicating he suspected this was a bunch of shit. They didn’t know how to deal with it.”

Indeed, the timing of the HDL revelations could not have been less convenient. The results were first revealed to the public in an American Heart Association seminar in New York on January 17, 1977. This was just three days after George McGovern had announced the publication of the Dietary Goals for the United States, advocating low-fat, high-carbohydrate diets for all Americans, based exclusively on Keys’s hypothesis that coronary heart disease was caused by the effect of saturated fat on total cholesterol. If the New York Times account of the proceedings is accurate, the AHA and the assembled investigators went out of their way to ensure that the new evidence would not cast doubt on Keys’s hypothesis or the new dietary goals. Rather than challenge the theory that excess cholesterol can cause heart disease, the Times reported, “the findings re-emphasize the importance of a fatty diet in precipitating life-threatening hardening of the arteries in most Americans,” which is precisely what they did not do. According to the Times, saturated fat was now indicted not just for increasing LDL cholesterol, which it does, but for elevating VLDL triglycerides and lowering HDL, which it does not, and certainly not compared with the carbohydrates that McGovern’s Dietary Goals were recommending all Americans eat instead.

In a more rational world, which means a research establishment not already committed to Keys’s hypothesis and not wholly reliant on funding from the institutions that had embraced the theory, the results would have immediately prompted small clinical trials of the hypothesis that raising HDL prevented heart disease, just like those small trials that had begun in the 1950s to test Keys’s hypothesis. If those confirmed the hypothesis, then longer, larger trials would be needed to establish whether the short-term benefits translated to a longer, healthier life. But the NIH administrators decided that HDL studies would have to wait. Once the Lipid Research Clinics Trial results were published in 1984, they were presented to the world as proof that lowering cholesterol by eating less fat and more carbohydrates was the dietary answer to heart disease. There was simply no room now in the dogma for a hypothesis that suggested that raising HDL (and lowering triglycerides) by eating more fat and less carbohydrates might be the correct approach. No clinical trials of the HDL hypothesis would begin in the U.S. until 1991, when the Veterans Administration funded a twenty-center drug trial. The results, published in 1999, supported the hypothesis that heart disease could be prevented by raising HDL. The drug used in the study, gemfibrozil, also lowered triglyceride levels and VLDL, suggesting that a diet that did the same by restricting carbohydrates might have a similarly beneficial effect. As of 2006, no such dietary trials had been funded. Through the 1980s and 1990s, as our belief in the low-fat heart-healthy diet solidified, the official reports on nutrition and health would inevitably discuss the apparent benefits of raising HDL—the “good cholesterol”—and would then observe correctly that no studies existed to demonstrate this would prevent heart disease and lengthen life. By 2000, well over $1 billion had been spent on trials of cholesterol-lowering, and a tiny fraction of that amount on testing the benefits of raising HDL. Thus, any discussions about the relative significance of raising HDL versus lowering total cholesterol would always be filtered through this enormous imbalance in the research efforts. Lowering LDL cholesterol would always have the appearance of being more important.
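[A side note on the statistics in that excerpt: the “multivariate equations” point is an ordinary property of regression. When two predictors carry largely the same signal — as triglycerides and HDL do, both tracking the same underlying metabolic state — adding the second predictor shrinks the apparent coefficient of the first, even if the first tracks the true cause quite well. Here is a minimal sketch in Python, on entirely synthetic data of my own invention (nothing here comes from Framingham or from Taubes), just to show the mechanism:

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Assume one underlying metabolic factor drives high triglycerides,
# low HDL, and heart-disease risk alike (synthetic toy numbers).
metabolic = rng.normal(size=n)
triglycerides = metabolic + 0.5 * rng.normal(size=n)
hdl = -metabolic + 0.5 * rng.normal(size=n)
risk = metabolic + rng.normal(size=n)

# Univariate fit: triglycerides alone look strongly predictive of risk.
slope_alone = np.polyfit(triglycerides, risk, 1)[0]

# Multivariate fit: once HDL enters the equation, the triglyceride
# coefficient shrinks, because the two variables share the same signal.
X = np.column_stack([triglycerides, hdl, np.ones(n)])
coefs, *_ = np.linalg.lstsq(X, risk, rcond=None)

print(f"triglyceride coefficient, alone:    {slope_alone:.2f}")  # roughly 0.8
print(f"triglyceride coefficient, with HDL: {coefs[0]:.2f}")     # roughly 0.4

The shrinkage is real arithmetic, but it says nothing by itself about whether triglycerides cause heart disease — which, as Taubes notes, was the question the risk equations were taken to have settled.]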

pp. 212-214

Reaven’s 1988 Banting Lecture is credited as the turning point in the effort to convince diabetologists of the critical importance of insulin resistance and hyperinsulinemia, but those investigators concerned with the genesis of heart disease paid little attention, considering anything having to do with insulin to be relevant only to diabetes. This was a natural consequence of the specialization of scientific research. Through the mid-1980s, Reaven’s research had focused on diabetes and insulin, and so his publications appeared almost exclusively in journals of diabetes, endocrinology, and metabolism. Not until 1996 did Reaven publish an article on Syndrome X in the American Heart Association journal Circulation, the primary journal for research in heart disease. Meanwhile, his work had no influence on public-health policy or the public’s dietary consciousness. Neither the 1988 Surgeon General’s Report on Nutrition and Health nor the National Academy of Sciences’s 1989 Diet and Health mentioned insulin resistance or hyperinsulinemia in any context other than Reaven’s cautions that high-carbohydrate diets might not be ideal for Type 2 diabetics. Both reports ardently recommended low-fat, high-carbohydrate diets for the prevention of heart disease.

Even the diabetes community found it easier to accept Reaven’s science than its dietary implications. Reaven’s observations and data “speak for themselves,” as Robert Silverman of the NIH suggested at a 1986 consensus conference on diabetes prevention and treatment. But they placed nutritionists in an awkward position. “High protein levels can be bad for the kidneys,” said Silverman. “High fat is bad for your heart. Now Reaven is saying not to eat high carbohydrates. We have to eat something.” “Sometimes we wish it would go away,” Silverman added, “because nobody knows how to deal with it.”

This is what psychologists call cognitive dissonance, or the tension that results from trying to hold two incompatible beliefs simultaneously. When the philosopher of science Thomas Kuhn discussed cognitive dissonance in scientific research—“the awareness of an anomaly in the fit between theory and nature”—he suggested that scientists will typically do what they have invariably done in the past in such cases: “They will devise numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” And that’s exactly what happened with metabolic syndrome and its dietary implications. The syndrome itself was accepted as real and important; the idea that it was caused or exacerbated by the excessive consumption of carbohydrates simply vanished.

Among the few clinical investigators working on heart disease who paid attention to Reaven’s research in the late 1980s was Ron Krauss. In 1993, Krauss and Reaven together reported that small, dense LDL was another of the metabolic abnormalities commonly found in Reaven’s Syndrome X. Small, dense LDL, they noted, was associated with insulin resistance, hyperinsulinemia, high blood sugar, hypertension, and low HDL as well. They also reported that the two best predictors of the presence of insulin resistance and the dominance of small, dense LDL are triglycerides and HDL cholesterol—the higher the triglycerides and the lower the HDL, the more likely it is that both insulin resistance and small, dense LDL are present. This offers yet another reason to believe the carbohydrate hypothesis of heart disease, since metabolic syndrome is now considered perhaps the dominant heart-disease risk factor—a “coequal partner to cigarette smoking as contributors to premature [coronary heart disease],” as the National Cholesterol Education Program describes it—and both triglycerides and HDL cholesterol are influenced by carbohydrate consumption far more than by any fat.

Nonetheless, when small, dense LDL and metabolic syndrome officially entered the orthodox wisdom as risk factors for heart disease in 2002, the cognitive dissonance was clearly present. First the National Cholesterol Education Program published its revised guidelines for cholesterol testing and treatment. This was followed in 2004 by two conference reports: one describing the conclusions of a joint NIH-AHA meeting on scientific issues related to metabolic syndrome, and the other, in which the American Diabetes Association joined in as well, describing joint treatment guidelines. Scott Grundy of the University of Texas was the primary author of all three documents. When I interviewed Grundy in May 2004, he acknowledged that metabolic syndrome was the cause of most heart disease in America, and that this syndrome is probably caused by the excessive consumption of refined carbohydrates. Yet his three reports—representing the official NIH, AHA, and ADA positions—all remained firmly wedded to the fat-cholesterol dogma. They acknowledge metabolic syndrome as an emerging risk factor for heart disease, but identify LDL cholesterol as “the primary driving force for coronary atherogenesis.” Thus, heart disease in America, as the National Cholesterol Education Program report put it, was still officially caused by “mass elevations of serum LDL cholesterol result[ing] from the habitual diet in the United States, particularly diets high in saturated fats and cholesterol.”

There was no mention that carbohydrates might be responsible for causing or exacerbating either metabolic syndrome or the combination of low HDL, high triglycerides, and small, dense LDL, which is described as occurring “commonly in persons with premature [coronary heart disease].” In the now established version of the alternative hypothesis—that metabolic syndrome leads to heart disease—the carbohydrates that had always been considered the causative agent had been officially rendered harmless. They had been removed from the equation of nutrition and chronic disease, despite the decades of research and observations suggesting the critical causal role they played.

The Big Fat Surprise
by Nina Teicholz
pp. 57-58

It’s not that no one questioned Keys along the way, of course. There were plenty of skeptics, including esteemed, influential scientists. Remember that Swedish egg-eating doctor, Uffe Ravnskov? On my own travels through the world of nutrition as I researched this book, he was the first “skeptic” I met. Whereas once a large and prominent group of scientists had opposed Keys and his hypothesis, the great majority of them had disappeared by the late 1980s. Ravnskov picked up their torch later, with the publication of a book called Cholesterol Myths in 2000.

At a conference that we were both attending near Copenhagen in 2005, he stood out in the crowd simply because he was willing to confront this gathering of top nutrition experts by asking questions that were considered long since settled.

“The whole pathway, from cholesterol in the diet, to cholesterol in the blood, to heart disease—has this pathway really been proven?” he stood up and asked, rightly though rhetorically, after a presentation one day.
“Tsh! Tsh! Tsh!” A hundred-plus scientists wagged their heads in unison.
“Next question?” asked an irritated moderator.

The incident illustrated, for me, the most remarkable aspect of the nutrition research community, namely its surprising lack of oxygen for alternative viewpoints. When I started out my research, I expected to find a community of scientists in decorous debate. Instead, I found researchers like Ravnskov, who, by his own admission, was a cautionary tale for independently minded scientists seeking to challenge the conventional wisdom. His predecessors from the 1960s onward hadn’t been convinced by the orthodoxy on cholesterol; they’d just been silenced, worn out, or had come to the end of their careers. As Keys’s ideas spread and became adopted by powerful institutions, those who challenged him faced a difficult—some might say impossible—battle. Being on the losing side of such a high-stakes debate had caused their professional lives to suffer. Many of them had lost jobs, research funding, speaking engagements, and all the many other perks of prestige. Although these diet-heart opponents included a number of researchers who were at the top of their fields, including, notably, an editor of the Journal of the American Medical Association, they were not invited to conferences and were unable to get prestigious journals to publish their work. Experiments that had dissenting results, they found, were not debated and discussed but instead dismissed or ignored altogether. Even being subject to slander and personal ridicule were surprisingly not unusual experiences for these opponents of the diet-heart hypothesis. In short, they found themselves unable to continue contributing to their fields, which of course is the very essence of every scientist’s hopes and ambitions.

To a surprising degree, in fact, the story of nutritional science is not, as we would expect, one of sober-minded researchers moving with measured, judicious steps. It falls, instead, under the “Great Man” theory of history, whereby strong personalities steer events using their own personal charisma, intelligence, wisdom, or wits. In the history of nutrition, Ancel Keys was, by far, the Greatest Man.

pp. 106-108

On the whole, said Manning Feinleib, an associate director at the NHLBI who attended the meetings as a rapporteur, the committee seemed to consider the downside of cancer to be less important than the upside of reducing heart disease. I spoke to him in 2009, and he was clearly dismayed that the issue of low cholesterol and cancer had still not been settled. “Oh boy, it’s been more than twenty-five years, and they have still not shed more light on what’s going on, and why not? That’s even more puzzling.”

In 1990, the NHLBI held yet another meeting on the problem of “significantly increased” death rates from cancer and other noncardiovascular causes for people with low cholesterol. The lower the cholesterol, the worse it looked for cancer deaths, and damningly, it looked especially bad for healthy men who were actively trying to reduce their cholesterol through diet or drugs. But there was no follow-up to these meetings, and the results did not change the enthusiasm for the “prudent diet.” The effects of low cholesterol are still not well understood.

When I mentioned all this to Stamler, he didn’t remember any part of this cancer-cholesterol debate. In this way, he is a microcosm of a larger phenomenon that allowed the diet-heart hypothesis to move forward: inconvenient results were consistently ignored; here again, “selection bias” was at work.

An Extreme Case of Selection Bias

There has been a lot of selective reporting and ignoring of the methodological problems over the years. But probably the most astonishing example of selection bias was the near-complete suppression of the Minnesota Coronary Survey, which was an outgrowth of the National Diet Heart Study. Also funded by NIH, the Minnesota Coronary Survey is the largest-ever clinical trial of the diet-heart hypothesis and therefore certainly belongs on the list along with Oslo, the Finnish Mental Hospital Study, and the LA Veterans Trial, but it is rarely included, undoubtedly because it didn’t turn out the way nutrition experts had hoped.

Starting in 1968, the biochemist Ivan Frantz fed nine thousand men and women in six Minnesota state mental hospitals and one nursing home either “traditional American foods,” with 18 percent saturated fat, or a diet containing soft margarine, a whole-egg substitute, low-fat beef, and dairy products “filled” with vegetable oil. This diet cut the amount of saturated fat in half. (Both diets had a total of 38 percent fat overall.) Researchers reported “nearly 100% participation,” and since the population was hospitalized, it was more controlled than most—although, like the Finnish hospital study, there was a good deal of turnover in the hospital (the average length of stay was only about a year).

After four-and-a-half years, however, the researchers were unable to find any differences between the treatment and control groups for cardiovascular events, cardiovascular deaths, or total mortality. Cancer was higher in the low-saturated-fat group, although the report does not say if that difference was statistically significant. The diet low in saturated fat had failed to show any advantage at all. Frantz, who worked in Keys’s university department, did not publish the study for sixteen years, until after he retired, and then he placed his results in the journal Arteriosclerosis, Thrombosis, and Vascular Biology, which is unlikely to be read by anyone outside the field of cardiology. When asked why he did not publish the results earlier, Frantz replied that he didn’t think he’d done anything wrong in the study. “We were just disappointed in the way it came out,” he said. In other words, the study was selectively ignored by its own director. It was another inconvenient data point that needed to be dismissed.

pp. 114-

In the United States, Pete Ahrens, who was still the prudent diet’s most prominent critic, continued to publish his central point of caution: the diet-heart hypothesis “is still a hypothesis . . . I sincerely believe we should not . . . make broadscale recommendations on diets and drugs to the general public now.”

By the late 1970s, however, the number of scientific studies had grown to such “unmanageable proportions,” as one Columbia University pathologist put it, that it was overwhelming. Depending on how one interpreted the data and how one weighed all the caveats, the dots could be connected to point in different directions. The ambiguities inherent to nutrition studies opened the door for their interpretation to be influenced by bias—which hardened into a kind of faith. There were simply “believers” and “nonbelievers,” according to cholesterol expert Daniel Steinberg. A number of interpretations of the data were possible and equally compelling from a scientific perspective, but there was only one for “believers,” while “disbelievers” became heretics outside the establishment.

Thus, the normal defenses of modern science had been flattened by a perfect storm of forces gathered in postwar America. In its impressionable infancy and compelled by an urgent drive to cure heart disease, nutrition science had bowed to charismatic leaders. A hypothesis had taken center stage; money poured in to test it, and the nutrition community embraced the idea. Soon there was very little room for debate. The United States had embarked upon a giant nutritional experiment to cut out meat, dairy, and dietary fat altogether, shifting calorie-consumption over to grains, fruits, and vegetables. Saturated animal fats would be replaced by polyunsaturated vegetable oils. It was a new, untested diet—just an idea, presented to Americans as the truth. Many years later, science started to show that this diet was not very healthy after all, but it was too late by then, since it had been national policy for decades already.

pp. 142-145

The Consensus Conference

If a large portion of middle-aged American adults are now cutting back on meat and taking statin pills, it is due almost entirely to the step that the NHLBI took next. Dispensing drugs and dietary advice to the entire US population is a huge responsibility, and the NHLBI decided it needed to create a scientific consensus, or at least the appearance of one, before moving forward. Also, the agency needed to define the exact cholesterol thresholds above which it could tell doctors to prescribe a low-fat diet or a statin. So once again, in 1984, NHLBI convened an expert group in Washington, DC, with a public meeting component attended by more than six hundred doctors and researchers. Their job—in an unrealistic two-and-a-half days—was to grapple with and debate the entire, massive stack of scientific literature on diet and disease, and then to come to a consensus about the recommended cholesterol targets for men and women of all ages.

The conference was described by various attendees as having preordained results from the start, and it’s hard not to conclude otherwise. The sheer number of people testifying in favor of cholesterol lowering was larger than the number of spaces allotted to challengers, and powerful diet-heart supporters controlled all the key posts: Basil Rifkind chaired the planning committee, Daniel Steinberg chaired the conference itself, and both men testified.

The conference “consensus” statement, which Steinberg read out on the last morning of the event, was not a measured assessment of the complicated role that diet might play in a little-understood disease. Instead, there was “no doubt,” he stated, that reducing cholesterol through a low-fat, low-saturated-fat diet would “afford significant protection against coronary heart disease” for every American over the age of two. Heart disease would now be the most important factor driving dietary choices for the entire nation. After the conference, in March 1984, Time magazine ran an illustration on its cover of a face on a dinner plate, comprised of two fried-egg eyes over a bacon-strip frown. “Hold the Eggs and Butter!” stated the headline, and the story began: “Cholesterol is proved deadly, and our diet may never be the same.”

As we’ve seen, LRC had nothing to say about diet, and even its conclusions on cholesterol were only weakly supported by the data, but Rifkind had already demonstrated that he believed this extrapolation was fair. He told Time that the results “strongly indicate that the more you lower cholesterol and fat in your diet, the more you reduce the risk of heart disease.”

Gina Kolata, then a reporter for Science magazine, wrote a skeptical piece about the quality of the evidence supporting the conference’s conclusions. The studies “do not show that lowering cholesterol makes a difference,” she wrote, and she quoted a broad range of critics who worried that the data were not nearly strong enough to recommend a low-fat diet for all men, women, and children. Steinberg attempted to dismiss the criticisms by calling her article a case of the media’s appetite for “dissent [which] is always more newsworthy than consensus,” but the Time cover story in support of Steinberg’s stated conclusions was clearly an example of the opposite, and on the whole, the media supported the new cholesterol guidelines.

The consensus conference spawned an entirely new administration at the NIH, called the National Cholesterol Education Program (NCEP), whose job it remains to advise doctors about how to define and treat their “at-risk” patients, as well as to educate Americans themselves about the apparent advantages of lowering their cholesterol. In the following years, the NCEP’s expert panels became infiltrated by researchers supported by pharmaceutical money, and cholesterol targets were ratcheted ever lower, thereby bringing greater and greater numbers of Americans into the category that qualified for statins. And the low-fat diet, even though it had never been properly tested in a clinical trial to ascertain whether it could prevent heart disease, became the standard, recommended diet of the land.

For longtime critics of the diet-heart hypothesis such as Pete Ahrens, the consensus conference was also significant because it marked the last time they could speak openly. After this conference, Ahrens and his colleagues were forced to fold their case. Although members of the nutrition elite had, over the previous two decades, been allowed to be part of the debate, in the years following the consensus conference, this was no longer true. To be a member of the elite now meant, ipso facto, supporting the low-fat diet. So effectively did the NHLBI-AHA alliance silence its antagonists, in fact, that among the tens of thousands of researchers in the worlds of medicine and nutrition over the next fifteen years, only a few dozen would publish research even gingerly challenging the diet-heart hypothesis. And even then, they worried about putting their careers on the line. They saw Ahrens, who had risen to the very top of his field and yet found himself having a hard time getting grants, because there was “a price to pay for going up against the establishment, and he was well aware of that,” as one of his former students told me.

No doubt this is why Ahrens, in looking back on the conference, which came to be his swan song, spoke with an uncharacteristic lack of reserve. “I think the public is being hosed by the NIH and the American Heart Association,” he declared. “They desire to do something good. They’re hoping to God that this is the right thing to do. But they are not acting on the basis of scientific evidence, but on the basis of a plausible but untested idea.” Plausible or even probable, however, that untested idea had now been launched.

pp. 319-328

These pioneering researchers of the Atkins diet continued to expand their work throughout the 2000s, conducting trials on a range of subjects: men and women, athletes, and those suffering from obesity, diabetes, and metabolic syndrome. And while the gains have varied, they have consistently pointed in the right direction. One of the more extraordinary experiments involved 146 men suffering from high blood pressure who went on the Atkins diet for almost a year. The group saw their blood pressure drop significantly more than did a group of low-fat dieters—who were also taking a blood-pressure medication.

In most of these experiments, the diet with the best results contained more than 60 percent of calories as fat. This proportion of fat was similar to what the Inuit and the Masai ate but was startlingly high compared to the official recommendations of 30 percent or less. Yet no other well-controlled trials of any other diet had ever shown such clear-cut advantages in the fight against obesity, diabetes, and heart disease, and for so many different kinds of populations.

Despite the consistency of these results, Westman and his colleagues have remained outsiders in the world of nutrition. Their work has perhaps predictably been met with silence, scorn, or both. Getting their research published in prestigious journals has been difficult, and invitations to major conferences are rare. Volek says that even when he’s been invited to present his findings at meetings, displaying research that confronts the very foundation of the conventional wisdom on diet, the reception is incurious: “people are just quiet.” And despite the substantial body of evidence now supporting the high-fat, low-carbohydrate regime as the healthiest option, his colleagues still routinely refer to the diet as “quackery” and a “fad.” Persevering in this field can be dispiriting, Volek told me. “You do deal with bias. . . . It’s very difficult to find grant money or journals that want to publish our studies.”

Westman has written poignantly about the predicament of working toward paradigm change when the existing bias is so strong: “When an unscientific fear of dietary fat pervades the culture so much that researchers who are on study sections that provide funding will not allow research into high-fat diets for fear of ‘harming people,’ ” as we’ve seen at the NIH and AHA, “this situation will not allow science to ‘self-correct.’ A sort of scientific taboo is created because of the low likelihood of funding, and the funding agencies are off the hook because they say that researchers are not submitting requests for grants.” […]

Gary Taubes and “The Big Fat Lie”

While these researchers have been ignored by most mainstream medical and nutrition communities, the one person who has successfully redirected the nutrition conversation over the past decade toward the idea that carbohydrates, not fat, are the drivers of obesity and other chronic diseases is the science journalist Gary Taubes. In 2001, he wrote a critical history of the diet-heart hypothesis for Science magazine, which was the first time a major scientific journal had published a thorough analysis of the low-fat dogma’s scientific weaknesses—at least since Pete Ahrens had ceded the battle against Ancel Keys in the mid-1980s. Taubes also reviewed all the science, from those prewar German and Austrian obesity researchers on through Pennington, and concluded that obesity was indeed a hormonal defect and not the result of gluttony and sloth. In his Science piece, Taubes described how the hormone causing obesity is most likely insulin, which spikes when one eats carbohydrates. One of his primary conclusions, in fact, was that dietary fat itself is the nutrient least likely to make you fat, because it’s the one macronutrient that doesn’t stimulate the production of insulin.

Other researchers and scientists had published critiques of the diet-heart hypothesis, but Taubes was the first to put together all the various ideas on the topic into one comprehensive narrative. And Taubes could reach a national audience. He followed up with a second foray in the New York Times Magazine, under the headline, “What if It’s All Been a Big Fat Lie?” In 2007, he published a book on the subject, Good Calories, Bad Calories, a densely annotated and meticulously researched work that made a comprehensive and original case for an “alternative” hypothesis on obesity and chronic disease. It argued that the refined carbohydrates and sugars in our diet are what cause obesity, diabetes, and related diseases, and not the dietary fat or the “excess calories” that are thought to come from eating more than we should.

Taubes has been the most influential recent challenger to the diet-heart hypothesis. Even Michael Pollan, the popular food writer who says we should eat “mostly plants,” praised Taubes for exposing the pseudoscience in the low-fat dogma and dubbed him the Alexander Solzhenitsyn of the nutrition world.

Taubes’s work shattered dogma to such an extent that most nutrition experts have been unable to respond except by simply dismissing him, as the field has managed to do with challengers so many times before. When Taubes’s book came out, Gina Kolata, medical writer for the New York Times, called Taubes “a brave and bold science journalist” but ended her review with an airy, “I’m sorry, I’m not convinced.” The chill in the nutrition community toward Taubes was so palpable in the mid-2000s, when I started my own research for this book, that although many diet-and-heart experts had apparently read Taubes, I found that no one was willing to talk about him. Taubes’s work as a science journalist had won him many awards, including three science-in-society awards from the National Association of Science Writers, the most that the group allows for any single science reporter. Yet roughly two thirds of my interviews with nutrition experts began with something like: “If you are taking the Gary Taubes line, then I’d rather not talk to you.”

Taubes, in turn, was a provocative critic of nutrition science and its practitioners. After one talk at a research institute, a senior faculty member asked, “Mr. Taubes, is it fair to say that one subtext of your talk is that you think we’re all idiots?” “A surprisingly good question,” Taubes wrote later on his blog. He explained that generations of researchers weren’t unintelligent; they had simply been educated into a biased way of thinking. Yet if the pursuit of science is about getting the right answer, wrote Taubes, then “getting the wrong answer on such a huge and tragic scale borders on inexcusable.” In the last line of his 2002 New York Times Magazine article, he quotes a researcher asking the not-so-rhetorical question: “Can we get the low-fat proponents to apologize?”

Despite the no-love-lost nature of the relationship between Taubes and mainstream nutrition experts, much of what he wrote seemed so eminently believable that it was almost immediately adopted. Of course sugar and white flour were bad! Nutrition experts spoke as if this had always been known. A 2010 headline in the Los Angeles Times declared, “Fat Was Once the Devil. Now More Nutritionists Are Pointing Accusingly at Sugar and Refined Grains.” Researchers around the country who had read and digested Taubes’s work were suddenly studying sucrose, fructose, and glucose, comparing them to each other and looking at their insulin effects. Some investigators have made the case recently that the fructose found in fruits, honey, table sugar, and high-fructose corn syrup may be worse than glucose in provoking the inflammation markers linked to heart disease. The glucose found in sugar and starchy vegetables, meanwhile, seems to work more closely with insulin to cause obesity. The science on these different types of refined carbohydrates is still in its infancy, so we don’t really know if all carbohydrates play a role in obesity, diabetes, and heart disease, or if some types are worse than others.

The one statement that seems safe to make is that the refined carbohydrates and sugars that we were recommended to eat by the AHA as part of a healthy, fat-avoiding diet, are not merely indifferent, “empty calories,” as we’ve long been told, but are actively bad for health in a variety of ways. Moreover, the clinical trials in recent years imply that any kind of carbohydrate, including those in whole grains, fruits, and starchy vegetables, are also unhealthy in large amounts. Remember that the Shai study in Israel found that the Mediterranean diet group, eating a high proportion of calories as these “complex” carbohydrates, turned out to be less healthy and fatter than the group on the Atkins-style diet, although they were healthier than the low-fat alternative. The Women’s Health Initiative, too, in which some 49,000 women were tested on a diet high in complex carbohydrates for nearly a decade, showed only marginal reductions in disease risk or weight. This big-picture message about how even too many unrefined carbohydrates might be bad for health is alienating for Americans, however, since we are now used to viewing these foods as healthy. And no doubt it would be difficult for nutrition experts to contradict their own half-century’s worth of high-carbohydrate advice.

Even so, whatever scientific progress has been made toward our greater understanding of carbohydrates generally in recent years has clearly been due to Taubes’s work. “This has been his most important contribution to the field,” said Ronald M. Krauss, an influential nutrition expert and the director of research at the Children’s Hospital Oakland Research Institute. For a journalist, it was an astonishing coup in the world of science. In 2013, Taubes became one of the rare journalists to write a peer-reviewed article for the highly respected scientific publication, the British Medical Journal. Yet given the stranglehold that Keys’s ideas have held on nutrition researchers for so many decades, it is perhaps inevitable that an alternative hypothesis had to come from an outsider.

Lore of Nutrition
by Tim Noakes & Marika Sboros
pp. 27-31
Preface by Marika Sboros

He explained that there was nothing new to what he was saying, that the evidence had been there for years, and that those in positions of power and influence over public nutrition advice had either ignored or suppressed this evidence. He directed me to scientific people, papers and places I didn’t even know existed.

I ended the conversation feeling unsettled. Noakes sounded eminently rational, reasonable and robustly scientific. I started reading all the references he gave me. I read the work of US physician-professors Stephen Phinney and Eric Westman, and Professor Jeff Volek. I read Eades; US science journalist Gary Taubes, author of Good Calories, Bad Calories and Why We Get Fat (and most recently The Case Against Sugar); and one of British obesity researcher Dr Zoë Harcombe’s many books, The Obesity Epidemic. I also read The Big Fat Surprise by US investigative journalist Nina Teicholz. That book thoroughly rocked my scientific worldview, as it has done for countless others.

The Wall Street Journal said of Teicholz’s book: ‘From the very beginning, we had the statistical means to understand why things did not add up; we had a boatload of Cassandras, a chorus of warnings; but they were ignored, castigated, suppressed. We had our big fat villain, and we still do.’ Former editor of the British Medical Journal Dr Richard Smith wrote about The Big Fat Surprise in a feature for the journal in 2014, titled ‘Are some diets “mass murder”?’ LCHF critics have suggested that prescribing a diet restricted in carbohydrates to the public is ‘the equivalent of mass murder’. Smith gained a very different impression after ploughing through five books on diet and some of the key studies to write his feature. The same accusation of ‘mass murder’ can be directed at ‘many players in the great diet game’, Smith said. In short, he said, experts have based bold policies on fragile science and the long-term results ‘may be terrible’. 3

For her book, Teicholz researched the influential US dietary guidelines, which were introduced in 1977 and which most English-speaking countries, including South Africa, subsequently adopted. She discovered that there was no evidence to support the guidelines’ low-fat, high-carb recommendations when they were first introduced, and that any evidence to the contrary was ignored or suppressed for decades.

My research into LCHF left me uneasy. As a journalist, I’m a messenger. I began to wonder whether I had been giving the wrong messages to my readers for decades. Had I unwittingly promoted advice that harmed people suffering from obesity, diabetes and heart disease? Among those was my father, Demetrius Sboros, who suffered from heart disease for many years before his death in 2002. Had I given him advice and information that shortened his life?

I put those worries aside and wrote up my interview with Noakes. The backlash was instant. On Twitter, total strangers called me irresponsible, unscientific, unethical and biased. Astonishingly, some were medical doctors, mostly former students of Noakes. They said that I was Noakes’s ‘cheerleader’, and even accused me of having a ‘crush’ on him. Some said that Noakes must have been paying me handsomely to say nice things about him. (For the record, he has never paid me anything, nor would he think to offer to pay me or I to accept.) Others said I was a ‘closet Banter’, as if that was the worst possible insult.

At first I was irritated. After all, I had quoted Noakes accurately. I had reflected what critics said about him, to ensure that I gave both sides. And anyway, I readily confess to bias, but only in favour of good science. I’ve always said that if anyone can show me robust evidence that Noakes is wrong about LCHF, I will publish it. Knowing him as I do, so will he.

Most of all, though, I was shocked at the venom behind the attacks on Noakes. He had simply done what any good scientist does when faced with compelling evidence that contradicts a belief: he had changed his mind. I’ve never seen much sense in having a mind if you can’t change it.

The attacks against him grew more gratuitously vicious and libellous. Then, in July 2014, researchers at UCT and the University of Stellenbosch published a study in PLoS One that became known as the Naudé review. 4

In August 2014, four of Noakes’s UCT colleagues published a letter in the Cape Times. Dubbed the UCT professors’ letter, it accused him of ‘making outrageous unproven claims about disease prevention’ and of ‘not conforming to the tenets of good and responsible science’. […]

As I continued my research, it became apparent why so many doctors, dietitians, and food and drug industries want to silence Noakes. He threatens their businesses, reputations, careers, funding and sponsors. And cardiologists and endocrinologists are not the only ones at risk of class-action lawsuits if, or more likely when, LCHF diets become mainstream, especially to treat health problems such as obesity, diabetes and heart disease. All doctors and dietitians may be at risk if it is shown that they knew about LCHF but deliberately chose not to offer it as an option to their patients.

When the HPCSA eventually charged Noakes in late 2014 with allegedly giving unconventional advice to a breastfeeding mother on Twitter, I began to prepare to report on the hearing. The deeper I dug, the more unpleasant the experience became. In 2015, for example, I was having what I thought was a relatively civil phone call with Johannesburg cardiologist Dr Anthony Dalby. I asked for comment on research suggesting that the diet-heart hypothesis was unproven. ‘If you believe that, then I leave it to you,’ he said, and hung up on me. Other doctors, academics and dietitians followed suit, avoiding my emails, or slamming the phone down if I ever managed to get past their gatekeepers.

Teicholz told me of similar experiences while doing research for The Big Fat Surprise. In response to a question on fat, an interviewee suddenly said, ‘I can’t talk about that,’ and hung up. Teicholz was shaken. ‘It felt as if I had been investigating organised crime,’ she said. The analogy was apt for her then. It became apt for me too.

The wall of silence I came up against while reporting on the HPCSA hearing should not have surprised me. I had a good working relationship with Claire Julsing Strydom, the dietitian who laid the initial complaint against Noakes – that is, until I started writing about her role in the whole affair. Strydom was president of the Association for Dietetics in South Africa when she lodged the complaint. Once I began asking uncomfortable questions, she stopped talking to me. ADSA executives and academics have followed suit, clearly acting on legal advice.

Like many, I enjoy a good conspiracy theory. However, at the first abortive attempt at a hearing session in June 2015, I wasn’t convinced of an organised campaign to discredit Noakes. By the trial’s end, I was.

Strydom and ADSA deny a vendetta against Noakes. Yet the signs were always there. Another ADSA executive member, Catherine ‘Katie’ Pereira, lodged a complaint with the HPCSA against Noakes in 2014 that was even more frivolous than Strydom’s. During an interview for a newspaper, Noakes had said that he didn’t know of any dietitian who told poor people not to drink Coca-Cola and eat potato crisps. (Most orthodox dietitians I know tell people that it’s fine to eat and drink these products as long as they do so ‘in moderation’.) The journalist made that comment a focus of the published interview. Pereira was offended on behalf of the entire dietetic profession. The HPCSA initially – and sensibly, to my mind – declined to prosecute. Strydom then intervened and pleaded with the HPCSA to charge Noakes. That case is still pending.

Nevertheless, to me, Strydom and ADSA have always looked more like patsies – proxies for Big Food and other vested interests opposed to Noakes. And this book turned into not so much a ‘whodunnit’ as a ‘why they dunnit?’.

pp. 32-34
Introduction by Marika Sboros

This is the story of a remarkable scientific journey. Just as remarkable is the genesis of that journey: a single, innocuous tweet.

In February 2014, a Twitter user asked a distinguished and world-renowned scientist a simple question: ‘Is LCHF eating ok for breastfeeding mums? Worried about all the dairy + cauliflower = wind for babies??’

Always willing to engage with an inquiring mind, Professor Tim Noakes tweeted back: ‘Baby doesn’t eat the dairy and cauliflower. Just very healthy high fat breast milk. Key is to ween [sic] baby onto LCHF.’

With those few words, Noakes set off a chain of events that would eventually see him charged with unprofessional conduct, caught up in a case that would drag on for more than three years and cost many millions of rands. More difficult, if not impossible, to quantify is the devastating emotional toll that the whole ordeal has taken on him and his family, as critics attacked his character and scientific reputation at every turn.

At the time, it was open season on Tim Noakes. Doctors, dietitians and assorted academics from South Africa’s top universities had been hard at work for years trying to discredit him. They did not like his scientific views on low-carbohydrate, high-fat foods, which he had been promoting since 2011. His opinions contrasted sharply with conventional, orthodox dietary ‘wisdom’, and the tweet provided the perfect pretext to amp up their attacks and hopefully silence him once and for all.

Within 24 hours of his tweet, a dietitian had reported him to the Health Professions Council of South Africa for giving what she considered ‘incorrect’, ‘dangerous’ and ‘potentially life-threatening’ advice. To Noakes’s surprise, the HPCSA took her complaint seriously.

Noakes is one of the few scientists in the world with an A1 rating from the South African National Research Foundation (NRF) for both sports science and nutrition. In his home country, he has no equal in terms of expertise in and research into LCHF. Few can match his large academic footprint – quantified by an H-index of over 70. The H- or Hirsch index is a measure of the impact of a scientist’s work. Noakes’s impact is significant. He has published more than 500 scientific papers, many of them in peer-reviewed journals, more than 40 of which deal exclusively with nutrition. He has been cited more than 17 000 times in the scientific literature.
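
To make that metric concrete, here is a minimal sketch of the Hirsch-index calculation (my illustration, not anything from the book; the citation counts are invented): an author has an h-index of h when h of their papers have each been cited at least h times.

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have at least `rank` citations
        else:
            break
    return h

# Invented example: five papers cited 10, 8, 5, 4 and 3 times.
# Four papers have at least four citations each, but there are not
# five papers with at least five, so the h-index is 4.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

By this definition, an H-index of over 70 means at least 70 papers that have each been cited at least 70 times.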

Yet, remarkably, the HPCSA chose to back the opinion of a dietitian in private practice over an internationally renowned nutrition research scientist. They charged him with ‘unprofessional conduct’ for providing ‘unconventional advice on breastfeeding babies on social networks’ and hauled him through the humiliating process of a disciplinary hearing.

The public quickly dubbed it ‘the Nutrition Trial of the 21st Century’. I’ve called it Kafkaesque. The HPCSA insisted that it was a hearing, not a trial, but the statutory body’s own conduct belied the claim.

At the time of Noakes’s tweet, I wanted to give up journalism. After more than 30 years of researching and writing about medicine and nutrition science, I was frustrated and bored. People were growing fatter and sicker, and the medical and dietetic specialists I wrote about weren’t making much difference to patients’ lives. Neither was my reporting.

Then I started investigating and writing about the HPCSA’s case against Noakes. The more questions I asked, the more walls of silence came up around me, and from the most unexpected sources. There’s an old saying that silence isn’t empty, it is full of answers. I found that the silence was loudest from those with the most to hide. I could not have foreseen the labyrinthine extent of vested interests ranged against Noakes, or the role played by shadowy proxy organisations for multinational sugar and soft-drink companies in suppressing and discrediting nutrition evidence.

It took a US investigative journalist to join many of the dots I had identified. Russ Greene’s research led to the International Life Sciences Institute (ILSI), a Coca-Cola front organisation. In an explosive exposé in January 2017, Greene showed how the ILSI has worked to support the nutrition status quo in South Africa, as well as the health professionals and food and drug industries that benefit from it. It has opened a branch in South Africa and has funded nutrition congresses throughout the country. It has also paid for dietitians and academics opposed to Noakes and LCHF to address conferences abroad.*

Of course, it might be coincidence that so many doctors, dietitians and academics with links to the ILSI became involved, directly and indirectly, in the HPCSA’s prosecution of Noakes. Then again, maybe not.

The HPCSA’s conduct throughout the hearing and since its conclusion has been revelatory. To a large extent, it confirms the premise of this book: that those in positions of power and influence in medicine and academia were using the case to pursue a vendetta against Noakes. The trial highlighted the inherent perils facing those brave enough to go against orthodoxy. It is in Noakes’s DNA as a scientist to seek truth and challenge dogma. He has done it many times before and has been proved right every time. I have no doubt that this time will be no different. On this latest journey, he has demonstrated the unflinching courage, integrity and dignity that are his hallmarks as one of the most eminent scientists of his time.

pp. 112-113

In retrospect, I could not then appreciate the extent to which the Centenary Debate was the opening salvo of what I believe to have been a much wider campaign, the ultimate goal of which was to silence me through public humiliation. It is a well-known technique called refutation by denigration. My perception is that if the actions of my colleagues meant that my status as an A1-rated scientist, who had contributed greatly to the scientific and financial efforts of UCT’s Faculty of Health Sciences over 35 years, was destroyed, well, in their opinion, that was just too bad. According to their worldview, I was the architect of my own downfall.

Only later, when I read Alice Dreger’s Galileo’s Middle Finger: Heretics, Activists, and One Scholar’s Search for Justice, did I begin to appreciate what I was really up against. Dreger’s book explores the unrelenting battle between scholars who put the pursuit of hard truths ahead of personal comfort and the social activists determined to silence them. She uses the voice of the social activist to explain what drives activists in their battles with empirical science and scientists:

We have to use our privilege to advance the rights of the marginalized. We can’t let [scientists] say what is true about the world. We have to give voice and power to the oppressed and let them say what is true about the world. Science is as biased as all human endeavors, and so we have to empower the disempowered, and speak always with them. 64

The difference, of course, is that the activists I was facing were not, in my view, motivated to advance the voices of the oppressed and disempowered but rather, wittingly or by proxy, the opposite.

In the face of this, what is the responsibility of those scientists who see their role as the pursuit of ‘truth’? Dreger’s answer is this:

To scholars I want to say more: Our fellow human beings can’t afford to have us act like cattle in an industrial farming system. If we take seriously the importance of truth to justice and recognize the many factors now acting against the pursuit of knowledge – if we really get why our role in democracy is like no other – then we really ought to feel that we must do more to protect each other and the public from misinformation and disinformation … 65

We scholars had to put the search for evidence before everything else, even when the evidence pointed to facts we did not want to see. The world needed that of us, to maintain – by our example, by our very existence – a world that would keep learning and questioning, that would remain free in thought, inquiry, and word. 66

In the end, she concludes: ‘Justice cannot be determined merely by social position. Justice cannot be advanced by letting “truth” be determined by political goals.’ 67 Nor, I might add, can commercial interests be allowed to determine what is the ‘truth’.

Dreger’s final message is this: ‘Evidence really is an ethical issue, the most important ethical issue in a modern democracy. If you want justice, you must work for truth. And if you want to work for truth, you must do a little more than wish for justice.’ 68

As the media onslaught began, I did not understand that these academic activists seemingly did not care about the science. Neither did the tabloid journalists or Twitter trolls, including some medical colleagues, who at about the same time began to target me on social media. Were they also willing co-conspirators in the rush to silence my voice?

pp. 123-129

At the time, I was en route to the Western Cape nature reserve Bartholomeus Klip, near the village of Hermon. Early the next morning, I opened my email and read the attached letter with growing incredulity. It carried the names (but not signatures) of four UCT academics, as well as – importantly – the logos of UCT and the UCT Faculty of Health Sciences. It therefore, in effect, signalled my ultimate academic rejection by all members of the university, and especially the medical faculty that I had served with distinction for 35 years. Only the deaths of my parents, Bob Woolmer and a few other close friends surpassed the emotional devastation this email caused me. […]

What struck me most about the letter was its cruelty and inhumanity, and that the authors showed not the slightest hint of conscience in publicly shaming me. Medicine is meant to be a caring profession in which we are concerned with the emotional health and needs of not just our patients, but also our colleagues and students. De Villiers appears to understand this. When he was eventually appointed rector and vice chancellor of the University of Stellenbosch in December 2014, his university profile stated: ‘He believes the University should offer an experience that is pleasant, welcoming and hospitable – in an inclusive environment.’ 7 Those admirable sentiments were remarkable for their absence from the Cape Times professors’ letter.

Instead, the letter is a textbook example of academic bullying, a topic recently reviewed by Dr Fleur Howells, senior lecturer in psychiatry at UCT. Howells writes that there are three forms of academic bullying. The third, ‘social bullying, also known as relational aggression, is the deliberate or active exclusion or damage to the social standing of the victim through, for example, publicly undermining a junior academic’s viewpoint’. 8 The four key components of bullying are intent to harm, experience of harm, exploitation of power and aggression. The professors’ letter thus neatly fulfils all the diagnostic criteria for academic bullying.

Jacqui Hoepner is currently completing her PhD thesis at the Australian National University, studying the use of these bullying tactics to suppress or silence dissenting scientific opinions. 9 In a discussion with Daryl Ilbury, author of Tim Noakes: The Quiet Maverick, Hoepner disclosed her original assumption that most cases of academic suppression or silencing arise from outside academic circles. To her surprise, she discovered the opposite – ‘the bulk of suppression or silencing came from within academia, from colleagues and competitors’, she told Ilbury. ‘This suggests that the assumed model of respect and disagreement between academics is inaccurate.’

Hoepner was astonished to uncover 43 different ‘silencing behaviours’ that fly in the face of the concept of academic freedom: ‘Every policy and university guideline I looked at suggested that academic freedom was absolutely central to what academics do and their place in society … [But] there’s a real disconnect between what academics think they are guaranteed under academic freedom and what the reality is for the life of an academic.’

She also discovered that the nature of these silencing attacks was ‘more of a personal gut response: that someone has crossed a boundary and we need to punish them. The exact motivation differed from case to case, but it seemed very much a visceral response.’

Typically, attacks are ad hominem, with accusations of conflicts of interest ‘to undermine credibility … without any attempt by the claimant of the accusations to provide any evidence’; with allegations such as ‘You’re doing real harm’, ‘You’re causing confusion’ or you’re undermining the public’s faith in science; and ending with demands that the researcher be ‘fired or disciplined in some way’.

Perhaps with direct relevance to my experience, Hoepner said: ‘If a scientist discovers evidence that contradicts decades of public health messaging and says that data doesn’t support the messaging, and that person is attacked, and publicly … that’s insane!’

Returning to the professors’ letter, it is also blatantly defamatory because it implies that I, as a medical practitioner: promote a diet that may cause harm (‘heart disease, diabetes mellitus, kidney problems … certain cancers’); make ‘outrageous unproven claims’; malign the integrity and credibility of peers who disagree with me; and undertake research that is not ‘socially responsible’ in the judgement of UCT.

The letter also breaches the HPCSA’s own ethical guidelines. Professor Bongani Mayosi, another signatory to the letter, was involved at that time in a review of the HPCSA management and functioning, and therefore should have been well versed in the ethical guidelines of the organisation he was investigating.

pp. 145-146

I presented De Villiers and Mayosi with copies of Nina Teicholz’s book, The Big Fat Surprise, and an editorial published the previous week in the British Medical Journal (BMJ). The editorial was a review of Teicholz’s book written by a former BMJ editor, Dr Richard Smith. 26 In it, he wrote the following:

By far the best of the books I’ve read to write this article is Nina Teicholz’s The Big Fat Surprise, whose subtitle is ‘Why butter, meat, and cheese belong in a healthy diet.’ The title, the subtitle, and the cover of the book are all demeaning, but the forensic demolition of the hypothesis that saturated fat is the cause of cardiovascular disease is impressive. Indeed, the book is deeply disturbing in showing how overenthusiastic scientists, poor science, massive conflicts of interest, and politically driven policy makers can make deeply damaging mistakes. Over 40 years I’ve come to recognize what I might have known from the beginning: that science is a human activity with the error, self deception, grandiosity, bias, self interest, cruelty, fraud and theft that is inherent in all human activities (together with some saintliness), but this book shook me.

After describing the bad science underlying all aspects of Ancel Keys’s diet-heart hypothesis, Smith concluded:

Reading these books and consulting some of the original studies has been a sobering experience. The successful attempt to reduce fat in the diet of Americans and others around the world has been a global, uncontrolled experiment, which like all experiments may well have led to bad outcomes. What’s more, it has initiated a further set of uncontrolled global experiments that are continuing. Teicholz has done a remarkable job in analyzing how weak science, strong personalities, vested interests, and political expediency have initiated this series of experiments. She quotes Nancy Harmon Jenkins, author of the Mediterranean Diet Cookbook and one of the founders of Oldways, as saying, ‘The food world is particularly prey to corruption, because so much money is made on food and so much depends on talk and especially the opinions of experts.’ It’s surely time for better science and for humility among experts.

In 2017, the other great British medical journal, The Lancet, published a similar review, concluding: ‘This is a disquieting book about scientific incompetence, evangelical ambition, and ruthless silencing of dissent that shaped our lives for decades … Researchers, clinicians, and health policy advisers should read this provocative book that reminds us about the importance of good science and the need to challenge dogma.’ 27