Inequality in the Anthropocene

This post was inspired by an article on the possibility of increasing suicides because of climate change. What occurred to me is that all the social and psychological problems seen with climate change are also seen with inequality (as decades of research show), and to a lesser extent with extreme poverty — although high poverty with low inequality isn’t necessarily problematic at all (e.g., the physically and psychologically healthy hunter-gatherers who are poor only in terms of material wealth and private property).

Related to this, I noticed in one article that a study was mentioned about the chances of war increasing when detrimental weather events are combined with ethnic diversity. And that reminded me of the research showing that diversity only leads to lowered trust when combined with segregation. A major problem with climate-related refugee crises is that they increase segregation, in the form of refugee camps and immigrant ghettoization. That segregation will lead to further conflict and destruction of the social fabric, which in turn will promote further segregation — a vicious cycle that will be hard to pull out of before the crash, especially as environmental conditions lead to droughts, famines, and plagues.

As economic and environmental conditions worsen, there are some symptoms that will become increasingly apparent and problematic. Based on the inequality and climatology research, we should expect increased stress, anxiety, fear, xenophobia, bigotry, suicide, homicide, aggressive behavior, short-term thinking, reactionary politics, and generally crazy and bizarre behavior. This will likely result in civil unrest, violent conflict, race wars, genocides, terrorism, militarization, civil wars, revolutions, international conflict, resource-based wars, world wars, authoritarianism, ethno-nationalism, right-wing populism, etc.

The only defense against this will be a strong, courageous left-wing response. That would require eliminating not only the derangement of the GOP but also the corruption of the DNC by replacing both with a genuinely democratic and socialist movement. Otherwise, our society will descend into collective madness and our entire civilization will be under existential threat. There is no other option.

* * *

The Great Acceleration and the Great Divergence: Vulnerability in the Anthropocene
by Rob Nixon

Most Anthropocene scholars date the new epoch to the late-eighteenth-century beginnings of industrialization. But there is a second phase to the Anthropocene, the so-called great acceleration, beginning circa 1950: an exponential increase in human-induced changes to the carbon cycle and nitrogen cycle and in ocean acidification, global trade, and consumerism, as well as the rise of international forms of governance like the World Bank and the IMF.

However, most accounts of the great acceleration fail to position it in relation to neoliberalism’s recent ascent, although most of the great acceleration has occurred during the neoliberal era. One marker of neoliberalism has been a widening chasm of inequality between the superrich and the ultrapoor: since the late 1970s, we have been living through what Timothy Noah calls “the great divergence.” Noah’s subject is the economic fracturing of America, the new American gilded age, but the great divergence has scarred most societies, from China and India to Indonesia, South Africa, Nigeria, Italy, Spain, Ireland, Costa Rica, Jamaica, Australia, and Bangladesh.

My central problem with the dominant mode of Anthropocene storytelling is its failure to articulate the great acceleration to the great divergence. We need to acknowledge that the grand species narrative of the Anthropocene—this geomorphic “age of the human”—is gaining credence at a time when, in society after society, the idea of the human is breaking apart economically, as the distance between affluence and abandonment is increasing. It is time to remold the Anthropocene as a shared story about unshared resources. When we examine the geology of the human, let us also pay attention to the geopolitics of the new stratigraphy’s layered assumptions.

Neoliberalism loves watery metaphors: the trickle-down effect, global flows, how a rising tide lifts all boats. But talk of a rising tide raises other specters: the coastal poor, who will never get storm-surge barriers; Pacific Islanders in the front lines of inundation; Arctic peoples, whose livelihoods are melting away—all of them exposed to the fallout from Anthropocene histories of carbon extraction and consumption in which they played virtually no part.

We are not all in this together
by Ian Angus

So the 21st century is being defined by a combination of record-breaking inequality with record-breaking climate change. That combination is already having disastrous impacts on the majority of the world’s people. The line is not only between rich and poor, or comfort and poverty: it is a line between survival and death.

Climate change and extreme weather events are not devastating a random selection of human beings from all walks of life. There are no billionaires among the dead, no corporate executives living in shelters, no stockbrokers watching their children die of malnutrition. Overwhelmingly, the victims are poor and disadvantaged. Globally, 99 percent of weather disaster casualties are in developing countries, and 75 percent of them are women.

The pattern repeats at every scale. Globally, the South suffers far more than the North. Within the South, the very poorest countries, mostly in Africa south of the Sahara, are hit hardest. Within each country, the poorest people—women, children, and the elderly—are most likely to lose their homes and livelihoods from climate change, and most likely to die.

The same pattern occurs in the North. Despite the rich countries’ overall wealth, when hurricanes and heatwaves hit, the poorest neighborhoods are hit hardest, and within those neighborhoods the primary victims are the poorest people.

Chronic hunger, already a severe problem in much of the world, will be made worse by climate change. As Oxfam reports: “The world’s most food-insecure regions will be hit hardest of all.”

Unchecked climate change will lock the world’s poorest people in a downward spiral, leaving hundreds of millions facing malnutrition, water scarcity, ecological threats, and loss of livelihood. Children will be among the primary victims, and the effects will last for lifetimes: studies in Ethiopia, Kenya, and Niger show that being born in a drought year increases a child’s chances of being irreversibly stunted by 41 to 72 percent.

Environmental racism has left black Americans three times more likely to die from pollution
By Bartees Cox

Without a touch of irony, the EPA celebrated Black History Month by publishing a report that finds black communities face dangerously high levels of pollution. African Americans are more likely to live near landfills and industrial plants that pollute water and air and erode quality of life. Because of this, more than half of the 9 million people living near hazardous waste sites are people of color, and black Americans are three times more likely to die from exposure to air pollutants than their white counterparts.

The statistics provide evidence for what advocates call “environmental racism.” Communities of color aren’t suffering by chance, they say. Rather, these conditions are the result of decades of indifference from people in power.

Environmental racism is dangerous. Trump’s EPA doesn’t seem to care.
by P.R. Lockhart

Studies have shown that black and Hispanic children are more likely to develop asthma than their white peers, as are poor children, with research suggesting that higher levels of smog and air pollution in communities of color are a factor. A 2014 study found that people of color live in communities that have more nitrogen dioxide, a pollutant that exacerbates asthma.

The EPA’s own research further supports this. Earlier this year, a paper from the EPA’s National Center for Environmental Assessment found that when it comes to air pollutants that contribute to issues like heart and lung disease, black people are exposed to 1.5 times more of the pollutant than white people, while Hispanic people are exposed to about 1.2 times the amount of non-Hispanic whites. People in poverty have 1.3 times the exposure of those not in poverty.

Trump’s EPA Concludes Environmental Racism Is Real
by Vann R. Newkirk II

Late last week, even as the Environmental Protection Agency and the Trump administration continued a plan to dismantle many of the institutions built to address those disproportionate risks, researchers embedded in the EPA’s National Center for Environmental Assessment released a study indicating that people of color are much more likely to live near polluters and breathe polluted air. Specifically, the study finds that people in poverty are exposed to more fine particulate matter than people living above poverty. According to the study’s authors, “results at national, state, and county scales all indicate that non-Whites tend to be burdened disproportionately to Whites.”

The study focuses on particulate matter, a group of both natural and manmade microscopic suspensions of solids and liquids in the air that serve as air pollutants. Anthropogenic particulates include automobile fumes, smog, soot, oil smoke, ash, and construction dust, all of which have been linked to serious health problems. Particulate matter has been classified as a known carcinogen by the International Agency for Research on Cancer, and it’s been named by the EPA as a contributor to several lung conditions, heart attacks, and possible premature deaths. The pollutant has been implicated in asthma prevalence and severity, low birth weights, and high blood pressure.

As the study details, previous works have also linked disproportionate exposure to particulate matter and America’s racial geography. A 2016 study in Environment International found that long-term exposure to the pollutant is associated with racial segregation, with more highly segregated areas suffering higher levels of exposure. A 2012 article in Environmental Health Perspectives found that overall levels of particulate matter exposure for people of color were higher than those for white people. That article also provided a breakdown of just what kinds of particulate matter count in the exposures. It found that while differences in overall particulate matter by race were significant, differences for some key particles were immense. For example, Hispanics faced rates of chlorine exposure that are more than double those of whites. Chronic chlorine inhalation is known to degrade cardiac function.

The conclusions from scientists at the National Center for Environmental Assessment not only confirm that body of research, but advance it in a top-rate public-health journal. They find that black people are exposed to about 1.5 times more particulate matter than white people, and that Hispanics have about 1.2 times the exposure of non-Hispanic whites. The study found that people in poverty had about 1.3 times more exposure than people above poverty. Interestingly, it also finds that for black people, the proportion of exposure is only partly explained by the disproportionate geographic burden of polluting facilities, meaning the magnitude of emissions from individual factories appears to be higher in minority neighborhoods.

These findings join an ever-growing body of literature that has found that both polluters and pollution are often disproportionately located in communities of color. In some places, hydraulic-fracturing oil wells are more likely to be sited in those neighborhoods. Researchers have found the presence of benzene and other dangerous aromatic chemicals to be linked to race. Strong racial disparities are suspected in the prevalence of lead poisoning.

It seems that almost anywhere researchers look, there is more evidence of deep racial disparities in exposure to environmental hazards. In fact, the idea of environmental justice—or the degree to which people are treated equally and meaningfully involved in the creation of the human environment—was crystallized in the 1980s with the aid of a landmark study illustrating wide disparities in the siting of facilities for the disposal of hazardous waste. Leaders in the environmental-justice movement have posited—in places as prestigious and rigorous as United Nations publications and numerous peer-reviewed journals—that environmental racism exists as the inverse of environmental justice, when environmental risks are allocated disproportionately along the lines of race, often without the input of the affected communities of color.

The idea of environmental racism is, like all mentions of racism in America, controversial. Even in the age of climate change, many people still view the environment mostly as a set of forces of nature, one that cannot favor or disfavor one group or another. And even those who recognize that the human sphere of influence shapes almost every molecule of the places in which humans live, from the climate to the weather to the air they breathe, are often loath to concede that racism is a factor. To many people, racism often connotes purposeful decisions by a master hand, and many see existing segregation as a self-sorting or poverty problem. Couldn’t the presence of landfills and factories in disproportionately black neighborhoods have more to do with the fact that black people tend to be disproportionately poor and thus live in less desirable neighborhoods?

But last week’s study throws more cold water on that increasingly tenuous line of thinking. While it lacks the kind of complex multivariate design that can really disentangle the exact effects of poverty and race, the finding that race has a stronger effect on exposure to pollutants than poverty indicates that something beyond just the concentration of poverty among black people and Latinos is at play. As the study’s authors write: “A focus on poverty to the exclusion of race may be insufficient to meet the needs of all burdened populations.” Their finding that the magnitude of pollution in communities of color is higher than the number of polluters alone would suggest indicates that regulations and business decisions are strongly dependent on whether people of color are around. In other words, they might be discriminatory.

This is a remarkable finding, and not only because it could provide one more policy linkage to any number of health disparities, from heart disease to asthma rates in black children that are double those of white children. But the study also stands as an implicit rebuke to the very administration that allowed its release.

Violence: Categories & Data, Causes & Demographics

Most violent crime correlates to social problems in general. Most social problems in general correlate to economic factors such as poverty, but even more so inequality. And in a country like the US, most economic factors correlate to social disadvantage and racial oppression, from economic segregation (redlining, sundown towns, etc.) to environmental racism (ghettos located in polluted urban areas, high toxicity rates among minorities, etc.) — consider how areas with historically high rates of slavery have, at present, higher levels of poverty and inequality, impacting not just blacks but also whites living in those communities.

Socialized Medicine & Externalized Costs

About 40 percent of deaths worldwide are caused by water, air and soil pollution, concludes a Cornell researcher. Such environmental degradation, coupled with the growth in world population, are major causes behind the rapid increase in human diseases, which the World Health Organization has recently reported. Both factors contribute to the malnourishment and disease susceptibility of 3.7 billion people, he says.

Percentages of Suffering and Death

Even accepting the data that Pinker uses, it must be noted that he isn’t including all violent deaths. Consider economic sanctions and neoliberal exploitation, vast poverty and inequality forcing people to work long hours in unsafe and unhealthy conditions, covert operations to overthrow governments and destabilize regions, anthropogenic climate change with its disasters, environmental destruction and ecosystem collapse, loss of arable land and food sources, pollution and toxic dumps, etc. All of this would involve food scarcity, malnutrition, starvation, droughts, rampant disease, refugee crises, diseases related to toxicity and stress, etc., along with all kinds of other consequences to people living in desperation and squalor.

This has all been intentionally caused through governments, corporations, and other organizations seeking power and profit while externalizing costs and harm. In my lifetime, the fatalities from this large-scale, often slow violence and intergenerational trauma could add up to hundreds of millions or maybe billions of lives cut short. Plus, as neoliberal globalization worsens inequality, there is a direct link to higher rates of homicides, suicides, and stress-related diseases for the most impacted populations. Yet none of these deaths would be counted as violent, no matter how horrific they are for the victims. And those like Pinker adding up the numbers would never have to acknowledge this overwhelming reality of suffering. It can’t be seen in the official data on violence, as the causes are disconnected from the effects. But why should only a small part of the harm and suffering get counted as violence?

Learning to Die in the Anthropocene: Reflections on the End of a Civilization
by Roy Scranton
Kindle Locations 860-888 (see here)

Consider: Once among the most modern, Westernized nations in the Middle East, with a robust, highly educated middle class, Iraq has been blighted for decades by imperialist aggression, criminal gangs, interference in its domestic politics, economic liberalization, and sectarian feuding. Today it is being torn apart between a corrupt petrocracy, a breakaway Kurdish enclave, and a self-declared Islamic fundamentalist caliphate, while a civil war in neighboring Syria spills across its borders. These conflicts have likely been caused in part and exacerbated by the worst drought the Middle East has seen in modern history. Since 2006, Syria has been suffering crippling water shortages that have, in some areas, caused 75 percent crop failure and wiped out 85 percent of livestock, left more than 800,000 Syrians without a livelihood, and sent hundreds of thousands of impoverished young men streaming into Syria’s cities. 90 This drought is part of long-term warming and drying trends that are transforming the Middle East. 91 Not just water but oil, too, is elemental to these conflicts. Iraq sits on the fifth-largest proven oil reserves in the world. Meanwhile, the Islamic State has been able to survive only because it has taken control of most of Syria’s oil and gas production. We tend to think of climate change and violent religious fundamentalism as isolated phenomena, but as Retired Navy Rear Admiral David Titley argues, “you can draw a very credible climate connection to this disaster we call ISIS right now.” 92

A few hundred miles away, Israeli soldiers spent the summer of 2014 killing Palestinians in Gaza. Israel has also been suffering drought, while Gaza has been in the midst of a critical water crisis exacerbated by Israel’s military aggression. The International Committee for the Red Cross reported that during summer 2014, Israeli bombers targeted Palestinian wells and water infrastructure. 93 It’s not water and oil this time, but water and gas: some observers argue that Israel’s “Operation Protective Edge” was intended to establish firmer control over the massive Leviathan natural gas field, discovered off the coast of Gaza in the eastern Mediterranean in 2010.94

Meanwhile, thousands of miles to the north, Russian-backed separatists fought fascist paramilitary forces defending the elected government of Ukraine, which was also suffering drought. 95 Russia’s role as an oil and gas exporter in the region and the natural gas pipelines running through Ukraine from Russia to Europe cannot but be key issues in the conflict. Elsewhere, droughts in 2014 sent refugees from Guatemala and Honduras north to the US border, devastated crops in California and Australia, and threatened millions of lives in Eritrea, Somalia, Ethiopia, Sudan, Uganda, Afghanistan, India, Morocco, Pakistan, and parts of China. Across the world, massive protests and riots have swept Bosnia and Herzegovina, Venezuela, Brazil, Turkey, Egypt, and Thailand, while conflicts rage on in Colombia, Libya, the Central African Republic, Sudan, Nigeria, Yemen, and India. And while the world burns, the United States has been playing chicken with Russia over control of Eastern Europe and the melting Arctic, and with China over control of Southeast Asia and the South China Sea, threatening global war on a scale not seen in seventy years. This is our present and future: droughts and hurricanes, refugees and border guards, war for oil, water, gas, and food.

Donald Trump Is the First Demagogue of the Anthropocene
by Robinson Meyer

First, climate change could easily worsen the inequality that has already hollowed out the Western middle class. A recent analysis in Nature projected that the effects of climate change will reduce the average person’s income by 23 percent by the end of the century. The U.S. Environmental Protection Agency predicts that unmitigated global warming could cost the American economy $200 billion this century. (Some climate researchers think the EPA undercounts these estimates.)

Future consumers will not register these costs so cleanly, though—there will not be a single climate-change debit exacted on everyone’s budgets at year’s end. Instead, the costs will seep in through many sources: storm damage, higher power rates, real-estate depreciation, unreliable and expensive food. Climate change could get laundered, in other words, becoming just one more symptom of a stagnant and unequal economy. As quality of life declines, and insurance premiums rise, people could feel that they’re being robbed by an aloof elite.

They won’t even be wrong. It’s just that due to the chemistry of climate change, many members of that elite will have died 30 or 50 years prior. […]

Malin Mobjörk, a senior researcher at the Stockholm International Peace Research Institute, recently described a “growing consensus” in the literature that climate change can raise the risk of violence. And the U.S. Department of Defense already considers global warming a “threat multiplier” for national security. It expects hotter temperatures and acidified oceans to destabilize governments and worsen infectious pandemics.

Indeed, climate change may already be driving mass migrations. Last year, the Democratic presidential candidate Martin O’Malley was mocked for suggesting that a climate-change-intensified drought in the Levant—the worst drought in 900 years—helped incite the Syrian Civil War, thus kickstarting the Islamic State. The evidence tentatively supports him. Since the outbreak of the conflict, some scholars have recognized that this drought pushed once-prosperous farmers into Syria’s cities. Many became unemployed and destitute, aggravating internal divisions in the run-up to the war. […]

They were not disappointed. Heatwaves, droughts, and other climate-related exogenous shocks do correlate to conflict outbreak—but only in countries primed for conflict by ethnic division. In the 30-year period, nearly a quarter of all ethnic-fueled armed conflict coincided with a climate-related calamity. By contrast, in the set of all countries, war only correlated to climatic disaster about 9 percent of the time.

“We cannot find any evidence for a generalizable trigger relationship, but we do find evidence for some risk enhancement,” Schleussner told me. In other words, climate disaster will not cause a war, but it can influence whether one begins.

Why climate change is very bad for your health
by Geordan Dickinson Shannon

Ecosystems

We don’t live in isolation from other ecosystems. From large-scale weather events, through to the food we eat daily, right down to the minute organisms colonising our skin and digestive systems, we live and breathe in co-dependency with our environment.

A change in the delicate balance of micro-organisms has the potential to lead to disastrous effects. For example, microbial proliferation – which is predicted in warmer temperatures driven by climate change – may lead to more enteric infections (caused by viruses and bacteria that enter the body through the gastrointestinal tract), such as salmonella food poisoning and increased cholera outbreaks related to flooding and warmer coastal and estuarine water.

Changes in temperature, humidity, rainfall, soil moisture and sea-level rise caused by climate change are also affecting the transmission of dangerous insect-borne infectious diseases. These include malaria, dengue, Japanese encephalitis, chikungunya, West Nile virus, lymphatic filariasis, plague, tick-borne encephalitis, Lyme disease, rickettsioses, and schistosomiasis.

Through climate change, the pattern of human interaction will likely change and so will our interactions with disease-spreading insects, especially mosquitoes. The World Health Organisation has also stressed the impact of climate change on the reproductive, survival and bite rates of insects, as well as their geographic spread.

Climate refugees

Perhaps the most disastrous effect of climate change on human health is the emergence of large-scale forced migration from the loss of local livelihoods and weather events – something that is recognised by the United Nations High Commissioner for Human Rights. Sea-level rise, decreased crop yield, and extreme weather events will force many people from their lands and livelihoods, while refugees in vulnerable areas also face amplified conditions such as fewer food supplies and more insect-borne diseases. And those who are displaced put a significant health and economic burden on surrounding communities.

The International Red Cross estimates that there are more environmental refugees than political refugees. Around 36m people were displaced by natural disasters in 2009; a figure that is predicted to rise to more than 50m by 2050. In one worst-case scenario, as many as 200m people could become environmental refugees.

Not a level playing field

Climate change has emerged as a major driver of global health inequalities. As J. Timmons Roberts, professor of Environmental Studies and Sociology at Brown University, put it:

Global warming is all about inequality, both in who will suffer most its effects and in who created the problem in the first place.

Global climate change further polarises the haves and the have-nots. The Intergovernmental Panel on Climate Change predicts that climate change will hit poor countries hardest. For example, the loss of healthy life years in low-income African countries is predicted to be 500 times that in Europe. The number of people in the poorest countries most vulnerable to hunger is predicted by Oxfam International to increase by 20% by 2050. And many of the major killers affecting developing countries, such as malaria, diarrhoeal illnesses, malnutrition and dengue, are highly sensitive to climate change, which would place a further disproportionate burden on poorer nations.

Most disturbingly, countries with weaker health infrastructure – generally situated in the developing world – will be the least able to cope with the effects of climate change. The world’s poorest regions don’t yet have the technical, economic, or scientific capacity to prepare or adapt.

Predictably, those most vulnerable to climate change are not those who contribute most to it. China, the US, and the European Union combined have contributed more than half the world’s total carbon dioxide emissions in the last few centuries. By contrast, and unfairly, countries that contributed the least carbon emissions (measured in per capita emissions of carbon dioxide) include many African nations and small Pacific islands – exactly those countries which will be least prepared and most affected by climate change.

Here’s Why Climate Change Will Increase Deaths by Suicide
by Francis Vergunst, Helen Louise Berry & Massimiliano Orri

Suicide is already among the leading causes of death worldwide. For people aged 15-55 years, it is among the top five causes of death. Worldwide nearly one million people die by suicide each year — more than all deaths from war and murder combined.

Using historical temperature records from the United States and Mexico, the researchers showed that suicide rates increased by 0.7 per cent in the U.S. and by 2.1 per cent in Mexico when the average monthly temperatures rose by 1°C.

The researchers calculated that if global temperatures continue to rise at these rates, between now and 2050 there could be 9,000 to 40,000 additional suicides in the U.S. and Mexico alone. This is roughly equivalent to the number of additional suicides that follow an economic recession.
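To see how a fractional rate increase per degree scales up to those projected totals, here is a back-of-envelope sketch. The baseline figure and warming assumptions below are hypothetical inputs for illustration, not the study's actual data or method:

```python
def added_suicides(baseline_annual, pct_per_degree, degrees_warming, years):
    """Rough projection of additional suicides attributable to warming.

    baseline_annual: current suicides per year (hypothetical input)
    pct_per_degree: percent increase in the suicide rate per 1 C of warming
    degrees_warming: assumed average warming sustained over the period
    years: number of years the elevated rate applies
    """
    extra_per_year = baseline_annual * (pct_per_degree / 100) * degrees_warming
    return extra_per_year * years

# Illustrative only: ~45,000 U.S. suicides per year, the study's 0.7% per
# degree, and an assumed 1.5 C of warming sustained over 30 years.
print(round(added_suicides(45_000, 0.7, 1.5, 30)))
```

Even with these rough assumptions, the result lands in the same tens-of-thousands range the researchers report for the U.S. and Mexico combined, which is why a sub-1-percent effect per degree still translates into a substantial death toll.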

Spikes during heat waves

It has been known for a long time that suicide rates spike during heat waves. Hotter weather has been linked with higher rates of hospital admissions for self-harm, suicide, and violent suicides, as well as increases in population-level psychological distress, particularly in combination with high humidity.

Another recent study, which combined the results of previous research on heat and suicide, concluded there is “a significant and positive association between temperature rises and incidence of suicide.”

Why this is so remains unclear. There is a well-documented link between rising temperatures and interpersonal violence, and suicide could be understood as an act of violence directed at oneself. Lisa Page, a researcher in psychology at King’s College London, notes:

“While speculative, perhaps the most promising mechanism to link suicide with high temperatures is a psychological one. High temperatures have been found to lead individuals to behave in a more disinhibited, aggressive and violent manner, which might in turn result in an increased propensity for suicidal acts.”

Hotter temperatures are taxing on the body. They cause an increase in the stress hormone cortisol, reduce sleep quality and disrupt people’s physical activity routines. These changes can reduce well-being and increase psychological distress.

Disease, water shortages, conflict and war

The effects of hotter temperatures on suicides are symptomatic of a much broader and more expansive problem: the impact of climate change on mental health.

Climate change will increase the frequency and severity of heat waves, droughts, storms, floods and wildfires. It will extend the range of infectious diseases such as Zika virus, malaria and Lyme disease. It will contribute to food and water shortages and fuel forced migration, conflict and war.

These events can have devastating effects on people’s health, homes and livelihoods and directly impact psychological health and well-being.

But effects are not limited to people who suffer direct losses — for example, it has been estimated that up to half of Hurricane Katrina survivors developed post-traumatic stress disorder even when they had suffered no direct physical losses.

The feelings of loss that follow catastrophic events, including a sense of loss of safety, can erode community well-being and further undermine mental health resilience.

The Broken Ladder
by Keith Payne
pp. 3-4 (see here)

[W]hen the level of inequality becomes too large to ignore, everyone starts acting strange.

But they do not act strange in just any old way. Inequality affects our actions and our feelings in the same systematic, predictable fashion again and again. It makes us shortsighted and prone to risky behavior, willing to sacrifice a secure future for immediate gratification. It makes us more inclined to make self-defeating decisions. It makes us believe weird things, superstitiously clinging to the world as we want it to be rather than as it is. Inequality divides us, cleaving us into camps not only of income but also of ideology and race, eroding our trust in one another. It generates stress and makes us all less healthy and less happy.

Picture a neighborhood full of people like the ones I’ve described above: shortsighted, irresponsible people making bad choices; mistrustful people segregated by race and by ideology; superstitious people who won’t listen to reason; people who turn to self-destructive habits as they cope with the stress and anxieties of their daily lives. These are the classic tropes of poverty and could serve as a stereotypical description of the population of any poor inner-city neighborhood or depressed rural trailer park. But as we will see in the chapters ahead, inequality can produce these tendencies even among the middle class and wealthy individuals.

pp. 119-120 (see here)

But how can something as abstract as inequality or social comparisons cause something as physical as health? Our emergency rooms are not filled with people dropping dead from acute cases of inequality. No, the pathways linking inequality to health can be traced through specific maladies, especially heart disease, cancer, diabetes, and health problems stemming from obesity. Abstract ideas that start as macroeconomic policies and social relationships somehow get expressed in the functioning of our cells.

To understand how that expression happens, we have to first realize that people from different walks of life die different kinds of deaths, in part because they live different kinds of lives. We saw in Chapter 2 that people in more unequal states and countries have poor outcomes on many health measures, including violence, infant mortality, obesity and diabetes, mental illness, and more. In Chapter 3 we learned that inequality leads people to take greater risks, and uncertain futures lead people to take an impulsive, live fast, die young approach to life. There are clear connections between the temptation to enjoy immediate pleasures versus denying oneself for the benefit of long-term health. We saw, for example, that inequality was linked to risky behaviors. In places with extreme inequality, people are more likely to abuse drugs and alcohol, more likely to have unsafe sex, and so on. Other research suggests that living in a high-inequality state increases people’s likelihood of smoking, eating too much, and exercising too little.

Essentialism On the Decline

Before getting to the topic of essentialism, let me take an indirect approach. In reading about paleolithic diets and traditional foods, a recurring theme is inflammation, specifically as it relates to the health of the gut-brain network and immune system.

The paradigm change this signifies is that seemingly separate diseases with different diagnostic labels often have underlying commonalities. They share overlapping sets of causal and contributing factors, biological processes, and symptoms. This is why simple dietary changes can have a profound effect on numerous health conditions. For some, the diseased state expresses itself as mood disorders, for others as autoimmune disorders, and for still others as something else entirely, but there are immense commonalities among them all. The differences have more to do with how the dysbiosis and dysfunction happen to develop, where they take hold in the body, and thus what symptoms are experienced.

From a paleo diet perspective, in treating both her patients and her own multiple sclerosis, Terry Wahls gets at this point in a straightforward manner (p. 47): “In a very real sense, we all have the same disease because all disease begins with broken, incorrect biochemistry and disordered communication within and between our cells. […] Inside, the distinction between these autoimmune diseases is, frankly, fairly arbitrary”. In How Emotions Are Made, Lisa Feldman Barrett wrote (Kindle Locations 3834-3850):

“Inflammation has been a game-changer for our understanding of mental illness. For many years, scientists and clinicians held a classical view of mental illnesses like chronic stress, chronic pain, anxiety, and depression. Each ailment was believed to have a biological fingerprint that distinguished it from all others. Researchers would ask essentialist questions that assume each disorder is distinct: “How does depression impact your body? How does emotion influence pain? Why do anxiety and depression frequently co-occur?” 9

“More recently, the dividing lines between these illnesses have been evaporating. People who are diagnosed with the same-named disorder may have greatly diverse symptoms— variation is the norm. At the same time, different disorders overlap: they share symptoms, they cause atrophy in the same brain regions, their sufferers exhibit low emotional granularity, and some of the same medications are prescribed as effective.

“As a result of these findings, researchers are moving away from a classical view of different illnesses with distinct essences. They instead focus on a set of common ingredients that leave people vulnerable to these various disorders, such as genetic factors, insomnia, and damage to the interoceptive network or key hubs in the brain (chapter 6). If these areas become damaged, the brain is in big trouble: depression, panic disorder, schizophrenia, autism, dyslexia, chronic pain, dementia, Parkinson’s disease, and attention deficit hyperactivity disorder are all associated with hub damage. 10

“My view is that some major illnesses considered distinct and “mental” are all rooted in a chronically unbalanced body budget and unbridled inflammation. We categorize and name them as different disorders, based on context, much like we categorize and name the same bodily changes as different emotions. If I’m correct, then questions like, “Why do anxiety and depression frequently co-occur?” are no longer mysteries because, like emotions, these illnesses do not have firm boundaries in nature.”

What jumped out at me was the conventional view of disease as essentialist, and hence the related essentialism in biology and psychology. This is exemplified by genetic determinism, such as when it informs race realism. It’s easy for most well-informed people to dismiss race realists, but essentialism takes on much more insidious forms that are harder to detect and root out. When scientists claimed to find a gay gene, some gay men quickly took this genetic determinism as a defense against the fundamentalist view that homosexuality is a choice and a sin. It turned out that there was no gay gene (by the way, this incident demonstrated how, in reacting to reactionaries, even leftist activists can be drawn into the reactionary mind). Not only is there no gay gene, but there are no simple and absolute gender divisions at all — as I previously explained (Is the Tide Starting to Turn on Genetics and Culture?):

“Recent research has taken this even further in showing that neither sex nor gender is binary (1, 2, 3, 4, & 5), as genetics and its relationship to environment, epigenetics, and culture is more complex than was previously realized. It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA. It has to do with diverse interlinking and overlapping causal relationships. We aren’t all that certain at this point what ultimately determines the precise process of conditions, factors, and influences in how and why any given gene expresses or not and how and why it expresses in a particular way.”

The attraction of essentialism is powerful. And as shown in numerous cases, the attraction can be found across the political spectrum, as it offers a seemingly strong defense by diverting attention away from other factors. Similar to the gay gene, many people defend neurodiversity as if some people are simply born a particular way and therefore we can’t and shouldn’t seek to change or improve their condition, much less cure it or prevent it in future generations.

For example, those on the high-functioning end of the autism spectrum will occasionally defend their condition as a gift, an ability to think and perceive differently. That is fine as far as it goes, but from a scientific perspective we still should find it concerning that conditions like this are on a drastic rise, a rise that can’t be explained by greater rates of diagnosis alone. Whether or not one believes the world would be a better place with more people with autism, this shouldn’t be left as a fatalistic vision of an evolutionary leap, especially considering most on the autism spectrum aren’t high functioning — instead, we should try to understand why it is happening and what it means.

Researchers have found that there are prospective causes to be studied. Consider propionate, a substance discussed by Alanna Collen (10% Human, p. 83): “although propionate was an important compound in the body, it was also used as a preservative in bread products – the very foods many autistic children crave. To top it all off, clostridia species are known to produce propionate. In itself, propionate is not ‘bad’, but MacFabe began to wonder whether autistic children were getting an overdose.” This might explain why antibiotics helped many with autism, as they would have been knocking down the clostridia population that was boosting propionate. To emphasize this point, when rodents were injected with propionate, they exhibited the precise behaviors of autism, and they too showed inflammation in the brain. The fact that autistics often have brain inflammation, an unhealthy condition, is strong evidence that autism shouldn’t be taken as mere neurodiversity (and, among autistics, the commonality of inflammation-related gut issues emphasizes this point).

There is no doubt that genetic determinism, like the belief in an eternal soul, can be comforting. We identify with our genes, as we inherit them and are born with them. But to speak of inflammation or propionate or whatever makes it seem like we are victims of externalities. And it means we aren’t isolated individuals to be blamed or to take credit for who we are. To return to Collen (pp. 88-89):

“In health, we like to think we are the products of our genes and experiences. Most of us credit our virtues to the hurdles we have jumped, the pits we have climbed out of, and the triumphs we have fought for. We see our underlying personalities as fixed entities – ‘I am just not a risk-taker’, or ‘I like things to be organised’ – as if these are a result of something intrinsic to us. Our achievements are down to determination, and our relationships reflect the strength of our characters. Or so we like to think.

“But what does it mean for free will and accomplishment, if we are not our own masters? What does it mean for human nature, and for our sense of self? The idea that Toxoplasma, or any other microbe inhabiting your body, might contribute to your feelings, decisions and actions, is quite bewildering. But if that’s not mind-bending enough for you, consider this: microbes are transmissible. Just as a cold virus or a bacterial throat infection can be passed from one person to another, so can the microbiota. The idea that the make-up of your microbial community might be influenced by the people you meet and the places you go lends new meaning to the idea of cultural mind-expansion. At its simplest, sharing food and toilets with other people could provide opportunity for microbial exchange, for better or worse. Whether it might be possible to pick up microbes that encourage entrepreneurship at a business school, or a thrill-seeking love of motorbiking at a race track, is anyone’s guess for now, but the idea of personality traits being passed from person to person truly is mind-expanding.”

This goes beyond the personal level, which lends a greater threat to the proposal. Our respective societies, communities, etc. might be heavily influenced by environmental factors that we can’t see. A ton of research shows the tremendous impact of parasites, heavy metal toxins, food additives, farm chemicals, hormones, hormone mimics, hormone disruptors, etc. Entire regions might be shaped by even a single species of parasite, such as how higher rates of Toxoplasma gondii infection in New England are directly correlated with higher rates of neuroticism (see What do we inherit? And from whom? & Uncomfortable Questions About Ideology).

Essentialism, though still popular, has taken numerous major hits in recent years. It once was the dominant paradigm and went largely unquestioned. Consider how, early last century, respectable fields and schools of thought such as anthropology, linguistic relativism, and behaviorism suggested that humans were largely products of environmental and cultural factors. This was the original basis of the attack on racism and race realism. In linguistics, Noam Chomsky overturned this view by positing the essentialist belief that, though never observed, much less proven, there must exist within the human brain a language module with a universal grammar. His theory defeated and replaced the non-essentialist theories because it was more satisfying to the WEIRD ideologies that were becoming a greater force in an increasingly WEIRD society.

Ever since Plato, Western civilization has been drawn toward the extremes of essentialism (as part of the larger Axial Age shift toward abstraction and idealism). Yet there has also long been a countervailing force (even among the ancients, non-essentialist interpretations were common; consider group identity: here, here, here, here, and here). It wasn’t predetermined that essentialism would be so victorious as to have nearly obliterated the memory of all alternatives. It fit the spirit of the times for this past century, but now the public mood is shifting again. It’s no accident that, as social democracy and socialism regain favor, environmentalist explanations are making a comeback. But this is merely the revival of a particular Western tradition of thought, a tradition that is centuries old.

I was reminded of this in reading Liberty in America’s Founding Moment by Howard Schwartz. It’s an interesting shift of gears, since Schwartz doesn’t write about anything related to biology, health, or science. But he does indirectly get at an environmentalist critique in his analysis of David Hume (1711-1776). I’ve mostly thought of Hume in terms of his bundle theory of self, possibly borrowed from Buddhism, which he might have learned about from Christian missionaries returning from the East. However he came to it, the bundle theory argued that there is no singular coherent self, contrary to a central tenet of traditional Christian theology. Still, heretical views of the self were hardly new — some detect a possible Western precursor of Humean bundle theory in the ideas of Baruch Spinoza (1632-1677).

Whatever its origins in Western thought, environmentalism has been challenging essentialism since the Enlightenment. And in Hume’s case, there is an early social constructionist view of society and politics, one in which what motivates people isn’t any fixed essence. This puts a different spin on things, as Hume’s writings were widely read during the revolutionary era when the United States was founded. Thomas Jefferson, among others, was familiar with Hume and highly recommended his work. Hume represented the position opposite to John Locke’s. We are now returning to this old battle of ideas.

Health From Generation To Generation

Traveling around the world, Weston A. Price visited numerous traditional communities. Some of them were hunter-gatherers and others agricultural, including some rural communities in Europe. This was earlier last century, when industrialization had yet to take hold in most places, a very different time in terms of diet, even in the Western world.

What he found was how healthy these people were, whether they consumed more or less meat, dairy or not — although none were vegetarian (the typical pre-agricultural diet was about 1/3 to 2/3 animal products, often a large part of it saturated fat). The commonality is that they ate nutrient-dense foods, much of it raw, fermented, or prepared traditionally (the single most nutrient-dense food is organ meats). As a dentist, the first thing Price looked for was dental health. A common feature of these traditional societies was well-developed jaws and bone structure, straight uncrowded teeth, few cavities, facial symmetry, etc. These people never saw a dentist or orthodontist, didn’t brush or floss, and yet their teeth were in excellent condition into old age.

This obviously was not the case with Price’s own American patients, who didn’t follow a traditional diet and lifestyle. And when he visited prisons, he found that bone development and dental health were far worse, as indicators of worse general health and, by implication, worse neurocognitive health (on a related note, testing has shown that prisoners have higher rates of lead toxicity, which harms health in diverse ways). Between malnutrition and toxicity, it is unsurprising that there are so many mentally ill people housed in prisons, especially after psychiatric institutions were closed down.

Another early figure in researching diet and health was Francis M. Pottenger Jr., an American doctor. While working as a full-time assistant at a sanatorium, he did a study on cats. He fed some cats a raw food diet, some a cooked food diet, and another group a mix of both. Like Price, he observed that the cooked food diet caused developmental problems of bone and dental structure. The results were worse than that, though. For the cats fed cooked food, the health of each successive generation declined even further. By the third generation, they didn’t reach adulthood. There was no generation after that.

I was reading about this at work. In my normal excitement about learning something new, I shared this info with a coworker, a guy who has some interest in health but is a conventional thinker. He immediately looked for reasons why it couldn’t be true, such as claiming that the generations of cats kept as pets disprove Pottenger’s observations. Otherwise, so the argument goes, domestic cats would presumably have gone extinct by now.

That was easy to counter, considering most pets are born strays who ate raw food or are born to parents who were strays. As for purebred cats, I’m sure breeders have already figured out that a certain amount of raw food (or supplementation with the enzymes, microbes, etc. that normally would be found in raw food) is necessary for long-term feline health. Like processed human food, processed pet food is heavily fortified with added nutrients, which likely counteracts some of the negative consequences of a cooked food diet. Pottenger’s cats weren’t eating fortified cooked food, but neither were the cats fed raw food getting any extra nutrients.

The thing is that, prior to industrialization, food was never fortified. All the nutrients humans (and cats) needed to not only survive but thrive were available in a traditional, natural diet. The fact that we have to fortify foods and take multivitamins is evidence of something severely wrong with the modern, industrialized food system. But fortification only lessens the health problems slightly. As with Pottenger’s cats, even the cats on a cooked food diet who had some raw food added didn’t avoid severely decreased health. Considering the emerging health crisis, the same appears to be true of humans.

The danger we face is that the effects accumulate across the generations the further we get from a traditional diet. We are only now a few generations into the modern Western diet. Most humans were still consuming raw milk and other traditional foods not that long ago. Earlier last century, the majority of Americans were rural and had access to fresh organic food from gardens and farms, including raw milk from pastured cows and fertile eggs from pastured chickens (pastured meaning high in omega-3s).

Even living in a large city, one of my grandfathers kept rabbits and chickens for much of his life and kept a garden into his old age. That means my mother was raised with quite a bit of healthy food, as was my father living in a small town surrounded by farms. My brothers and I are the first generation in our family to eat a fully modern industrialized diet from childhood. And indeed, we have more mental/neurocognitive health problems than the generations before. I had a debilitating learning disorder diagnosed in elementary school and severe depression clearly showing by 7th grade; one brother had stuttering and anxiety attacks early on; and my oldest brother had severe allergies in childhood that went untreated for years and since then has had a host of ailments (also, at least one of my brothers and I have suspected undiagnosed Asperger’s or something like it, but such conditions weren’t being diagnosed when we were in school). One thing to keep in mind is that my brothers and I are members of the generation that received one of the highest childhood exposures to lead, prior to environmental regulations limiting lead pollution; and research has directly and strongly correlated that exposure with higher rates of criminality, suicide, homicide, aggressive behavior, impulse control problems, lowered IQ, and stunted neurocognitive development (along with many physical health conditions).

The trend of decline seems to be continuing. My nieces and nephews eat almost nothing but heavily processed foods, way more than my brothers and I had in our own childhoods, and the produce they do eat is mostly from nutrient-depleted soil, along with being filled with farm chemicals and hormones — all of this having continuously worsened these past decades. They are constantly sick (often every few weeks) and, even though still in grade school, all have multiple conditions such as: Asperger’s, learning disorder, obsessive-compulsion, failure to thrive, asthma, joint pain, etc.

If sugar were heroin, my nephew could fairly be called a junky (regularly devouring bags of candy and, on more than one occasion, eating a plain bowl of sugar; one step short of snorting powdered sugar and mainlining high fructose corn syrup). And in making these observations, I speak from decades of experience as a junkfood junky, most of all a sugar addict, though never quite to the same extreme. My nieces too have a tremendous intake of sugar and simple carbs, as their families’ vegetarianism doesn’t emphasize vegetables (since going on the paleo diet, I’ve been eating more organic nutrient-dense vegetables and other wholesome foods than my brothers and their families combined) — yet their diet fits well into the Standard American Diet (SAD) and, as the USDA suggests, they get plenty of grains. I wouldn’t be surprised if one or all of them already has pre-diabetes, and they likely will get diabetes before long, as is becoming common in their generation. The body can take only so much harm. I know the damage done to my own body and mind from growing up in this sick society, and I hate to see even worse happening to the generations following.

To emphasize this point, the testing of newborn babies in the United States shows that they’ve already accumulated, on average, more than 200 synthetic chemicals while in the womb; and then imagine all the further chemicals they get from the breast milk of their unhealthy mothers, along with all kinds of crap in formulas and in their environments (e.g., carcinogenic fire retardants that they breathe 24/7). Lead toxicity has decreased since my own childhood, and that is a good thing, but thousands of new toxins and other chemicals have replaced it. On top of that, hormones, hormone mimics, and hormone disruptors add to dysbiosis and disease — some suggest this is a cause of puberty’s greater variance compared to past generations, with puberty coming earlier or later depending on gender and other factors (maybe partly explaining the reversal and divergence of educational attainment for girls and boys). Added to this mix, this is the first generation of human guinea pigs to be heavily medicated from childhood, much of it with medications that have been shown to permanently alter neurocognitive development.

A major factor in many modern diseases is inflammation. It has many causes, from leaky gut to toxicity, the former related to diet and often contributing to the latter (a leaky gut allows molecules to more easily cross the gut lining and get into the bloodstream, where they can freely travel throughout the body — causing autoimmune disorders, allergies, asthma, rheumatoid arthritis, depression, etc.). But obesity is another main cause of inflammation. And one might note that, when the body is overloaded and not functioning optimally, excess toxins are stored in fat cells — which makes losing weight even more difficult, as toxins released back into the body, if not flushed out, cause one to feel sick and tired.

It’s not simply bad lifestyle choices. We are living in unnatural and often outright toxic conditions. Many of the symptoms that we categorize as diseases are the body’s attempt to make the best of a bad situation. All of this adds up to dysfunction across society. Our healthcare system is already too expensive for most people to afford. And the largest part of public funding for healthcare is going to diabetes alone. But the saddest part is the severe decrease in quality of life, as the rate of mood and personality disorders skyrockets. It’s not just diet. For whatever reason (toxins? stress?), with greater urbanization has come greater levels of schizophrenia and psychosis. And autism, a rare condition in the past, has become highly prevalent (by the way, one of the proven effective treatments for autism is a paleo/keto diet, which is also effective for autoimmune conditions, among much else).

It’s getting worse and worse, generation after generation. Imagine what this means in terms of epigenetics and transgenerational trauma, as nutritional deficits and microbiotic decimation accumulate, exacerbated by a society driven mad through inequality and instability, stress and anxiety. If not for the nutrients added to our nutrient-poor food and the supplements added to our unhealthy diet, we’d already be dying out as a society and our civilization would’ve collapsed along with it (maybe similar to how some conjecture the Roman Empire weakened as lead toxicity increased in the population). Under these conditions, the saying that children are our future may not be an affirmation of hope. Nor may these children be filled with gratitude once they’ve reached adulthood and come to realize what we did to them and the world we left them. On the other hand, we aren’t forced to embrace fatalism and cynicism. We already know what to do to turn around all of these problems. And we don’t lack the money or other resources to do what needs to be done. All that we are waiting for is public demand and political will, although that might first require our society reaching a point of existential crisis… we are getting close.

The stumbling block is that there is no profit in the ‘healthcare’ industry for advocating, promoting, incentivizing, and ensuring a healthy diet and healthy conditions for a healthy population. Quite the opposite. If disease profiteering were made illegal, there would be trillions of dollars of lost profit every year. Disease is the reality of capitalist realism, a diseased economic system and social order. This collective state of sickliness has become the norm, and vested interests will go to great lengths to defend the status quo. But most who benefit from the dysfunctional and destructive system never have to give it much thought. When my mother brought my nephew to the doctor, she pointed out how he is constantly sick and constantly eating a poor diet. The doctor’s response was that this was ‘normal’ for kids (these days), which might be true, but the doctor should have been shocked and shamed by his own admission. As apathy takes hold and we lose a sense of hope, low standards fall ever lower.

We can’t rely upon the established authority figures in seeking better health for ourselves, our families, and our communities. We know what we need to do. It might not be easy to make such massive changes when everything in society is going against you. And no doubt it is more expensive to eat healthy when the unhealthiest foods (e.g., those made with high fructose corn syrup) are being subsidized by the government. It’s no accident that buying off the dollar menu at a fast food restaurant is cheaper than cooking a healthy meal at home. Still, if you are willing to go to the effort (and it is worth the effort), a far healthier diet is possible for many within a limited budget. That is assuming you don’t live in a food desert. But even in that case, there is a movement to create community gardens in poor neighborhoods, people providing for themselves what neither the government nor the economy will provide.

Revolutions always begin from the bottom up. Failing that, the foundations of our society will crumble as the health of our citizenry declines. It’s a decision we must make, individually and collectively: a choice between two divergent paths leading to separate possible futures. Though we have so far chosen suicidal self-destruction, we remain free to choose the other option. As Thomas Paine said, “We have it in our power to begin the world over again.”

* * *

Primal Nutrition
by Ron Schmid, ND
pp. 99-100

Parallels Between Pottenger’s and Price’s Work

While the experiments of McCarrison and Pottenger show the value of raw foods in keeping animals remarkably healthy, one might wonder about the relevance to human needs. Cats are carnivores, humans omnivores, and while the animals’ natural diet is raw, humans have cooked some foods for hundreds of thousands of years. But humans, cats, and guinea pigs are all mammals. And while the human diet is omnivorous, foods of animal origin (some customarily eaten raw) have always formed a substantial and essential part of it.

Problems in cats eating cooked foods provided parallels with the human populations Weston Price studied; the cats developed the same diseases as humans eating refined foods. The deficient generation of cats developed the same dental malformations that children of people eating modernized foods developed, including narrowing of dental arches with attendant crowding of teeth, underbites and overbites, and protruding and crooked teeth. The shape of the cat’s skull and even the entire skeleton became abnormal in severe cases, with concomitant marked behavioral changes.

Price observed these same physical and behavioral changes in both native and modern cultures eating refined foods. These changes accompanied the adoption by a culture of refined foods. In native cultures eating entirely according to traditional wisdom resulted in strength of character and relative freedom from the moral problems of modern cultures. In modern cultures, studies of populations of prisons, reformatories, and homes for the mentally delayed revealed that a large majority of individuals residing there (often approaching 100 percent) had marked abnormalities of the dental arch, often with accompanying changes in the shape of the skull.

This was not coincidence; thinking is a biological process, and abnormal changes in the shape of the skull from one generation to the next can contribute to changes in brain functions and thus in behavior. The behavioral changes in deficient cats were due to changes in nutrition. This was the only variable in Pottenger’s carefully controlled experiments. As with physical degenerative changes, parallels with human populations cannot help but suggest themselves, although the specific nature of the relationship is beyond the scope of this discussion.

Human beings do not have the same nutritional requirements as cats, but whatever else each needs, there is strong empirical evidence that both need a significant amount of certain high-quality raw foods to reproduce and function efficiently.

What is a gene?

Now: The Rest of the Genome
by Carl Zimmer

In this jungle of invading viruses, undead pseudogenes, shuffled exons and epigenetic marks, can the classical concept of the gene survive? It is an open question, one that Dr. Prohaska hopes to address at a meeting she is organizing at the Santa Fe Institute in New Mexico next March.

In the current issue of American Scientist, Dr. Gerstein and his former graduate student Michael Seringhaus argue that in order to define a gene, scientists must start with the RNA transcript and trace it back to the DNA. Whatever exons are used to make that transcript would constitute a gene. Dr. Prohaska argues that a gene should be the smallest unit underlying inherited traits. It may include not just a collection of exons, but the epigenetic marks on them that are inherited as well.

These new concepts are moving the gene away from a physical snippet of DNA and back to a more abstract definition. “It’s almost a recapture of what the term was originally meant to convey,” Dr. Gingeras said.

A hundred years after it was born, the gene is coming home.

Genome 2.0: Mountains Of New Data Are Challenging Old Views
by Patrick Barry

This complex interweaving of genes, transcripts, and regulation makes the net effect of a single mutation on an organism much more difficult to predict, Gingeras says.

More fundamentally, it muddies scientists’ conception of just what constitutes a gene. In the established definition, a gene is a discrete region of DNA that produces a single, identifiable protein in a cell. But the functioning of a protein often depends on a host of RNAs that control its activity. If a stretch of DNA known to be a protein-coding gene also produces regulatory RNAs essential for several other genes, is it somehow a part of all those other genes as well?

To make things even messier, the genetic code for a protein can be scattered far and wide around the genome. The ENCODE project revealed that about 90 percent of protein-coding genes possessed previously unknown coding fragments that were located far from the main gene, sometimes on other chromosomes. Many scientists now argue that this overlapping and dispersal of genes, along with the swelling ranks of functional RNAs, renders the standard gene concept of the central dogma obsolete.

Long Live The Gene

Offering a radical new conception of the genome, Gingeras proposes shifting the focus away from protein-coding genes. Instead, he suggests that the fundamental units of the genome could be defined as functional RNA transcripts.

Since some of these transcripts ferry code for proteins as dutiful mRNAs, this new perspective would encompass traditional genes. But it would also accommodate new classes of functional RNAs as they’re discovered, while avoiding the confusion caused by several overlapping genes laying claim to a single stretch of DNA. The emerging picture of the genome “definitely shifts the emphasis from genes to transcripts,” agrees Mark B. Gerstein, a bioinformaticist at Yale University.

Scientists’ definition of a gene has evolved several times since Gregor Mendel first deduced the idea in the 1860s from his work with pea plants. Now, about 50 years after its last major revision, the gene concept is once again being called into question.

Theory Suggests That All Genes Affect Every Complex Trait
by Veronique Greenwood

Over the years, however, what scientists might consider “a lot” in this context has quietly inflated. Last June, Pritchard and his Stanford colleagues Evan Boyle and Yang Li (now at the University of Chicago) published a paper about this in Cell that immediately sparked controversy, although it also had many people nodding in cautious agreement. The authors described what they called the “omnigenic” model of complex traits. Drawing on GWAS analyses of three diseases, they concluded that in the cell types that are relevant to a disease, it appears that not 15, not 100, but essentially all genes contribute to the condition. The authors suggested that for some traits, “multiple” loci could mean more than 100,000. […]

For most complex conditions and diseases, however, she thinks that the idea of a tiny coterie of identifiable core genes is a red herring because the effects might truly stem from disturbances at innumerable loci — and from the environment — working in concert. In a new paper out in Cell this week, Wray and her colleagues argue that the core gene idea amounts to an unwarranted assumption, and that researchers should simply let the experimental data about particular traits or conditions lead their thinking. (In their paper proposing omnigenics, Pritchard and his co-authors also asked whether the distinction between core and peripheral genes was useful and acknowledged that some diseases might not have them.)

Two Views of Present Christianity

First, everyone can be skeptical of science, including of course scientists themselves — after all, scientists are skeptics by profession. But skepticism pushed toward extreme denialism is mostly limited to the political right, some scientific issues standing out (e.g., climate change). And general distrust of science is broadly and consistently found only among religious conservatives.

This is a point that was made by Chris Mooney in his research showing that there is no equivalent on the political left — as far as I know, not even among the religious left. For example, the smart idiot effect is primarily found on the political right, such that knowledge really does matter to those on the political left (research shows that liberals, unlike conservatives, will more likely change their mind when they learn new info).

The role religion plays is in magnifying this difference between ideological tendencies.

Not All Skepticism Is Equal: Exploring the Ideological Antecedents of Science Acceptance and Rejection
by Bastiaan T. Rutjens, Robbie M. Sutton, & Romy van der Lee

To sum up the current findings, in four studies, both political conservatism and religiosity independently predict science skepticism and rejection. Climate skepticism was consistently predicted by political conservatism, vaccine skepticism was consistently predicted by religiosity, and GM food skepticism was consistently predicted by low faith in science and knowledge of science. General low faith in science and unwillingness to support science in turn were primarily associated with religiosity, in particular religious conservatism. Thus, different forms of science acceptance and rejection have different ideological roots, although the case could be made that these are generally grounded in conservatism.

Study: Conservatives’ Trust In Science At Record Low
by Eyder Peralta

While trust in science has remained flat for most Americans, a new study finds that for those who identify as conservatives trust in science has plummeted to its lowest level since 1974.

Gordon Gauchat, a sociology professor at the University of North Carolina at Chapel Hill, studied data from the General Social Survey and found that changes in confidence in science are not uniform across all groups.

“Moreover, conservatives clearly experienced group-specific declines in trust in science over the period,” Gauchat reports. “These declines appear to be long-term rather than abrupt.”

Just 35 percent of conservatives said they had a “great deal of trust in science” in 2010. That number was 48 percent in 1974. […]

Speaking to Gauchat, he said that what surprised him most about his study is that he ran statistical analysis on a host of different groups of people. He only saw significant change in conservatives and people who frequently attend church.

Gauchat said that even conservatives with bachelor’s degrees expressed distrust in science.

I asked him what could explain this and he offered two theories: First that science is now responsible for providing answers to questions that religion used to answer and secondly that conservatives seem to believe that science is now responsible for policy decisions. […]

Another bit of surprising news from the study, said Gauchat, is that trust in science for moderates has remained the same.

Here is the second point, which is more positive.

Religious conservatives are a shrinking and aging demographic, as liberal and left-wing views and labels continue to take hold. So, as their numbers decrease and their influence lessens, we Americans might finally be able to have a rational public debate about science that leads to pragmatic implementation of scientific knowledge.

The old guard of reactionaries is losing its grip on power, even within the once-strong bastions of right-wing religiosity. But like an injured and dying wild animal, it will make a lot of noise and can still be dangerous. The reactionaries will become more reactionary, as we have recently seen. This moment of conflict shall pass, as it always does. Like it or not, change will happen; indeed, it is already happening.

There is one possible explanation for this change. Science denialism is a hard attitude to maintain over time, even with the backfire effect. It turns out that even conservatives change their opinions based on expert knowledge, even if it takes longer. So, despite evidence showing no short-term change in policy positions, we should expect the political shift to continue across the generations.

Knowledge does matter. But it requires immense repetition and patience. Also, keep in mind that, since knowledge matters even more to the political left, the power of knowledge will increase as the general population moves further left. This might be related to the fact that the average American is increasingly better educated. Admittedly, Americans aren't all that well educated in comparison to some countries, but in comparison to the state of education in the past there has been a dramatic improvement.

However you wish to explain it, the religious and non-religious alike are becoming more liberal and progressive, even more open to social democracy and democratic socialism. There is no evidence that this shift has stopped or reversed. Conservatism will remain a movement in the future, but it will probably look more like the present Democratic Party than the present Republican Party. As the political parties have gone far right, the American public has moved so far left as to be outside of the mainstream spectrum of partisan politics.

We are beginning to see the results.

Pro-Life, Pro-Left
by Molly Worthen
(see Evangelicals Turn Left)

70 percent of evangelicals now tell pollsters they don’t identify with the religious right, and younger evangelicals often have more enthusiasm for social justice than for the culture wars

Trump Is Bringing Progressive Protestants Back to Church
by Emma Green

In the wake of Donald Trump’s election, some conservative Christians have been reckoning with feelings of alienation from their peers, who generally voted for Trump in strong numbers. But at least some progressive Protestant churches are experiencing the opposite effect: People have been returning to the pews.

“The Sunday after the election was the size of an average Palm Sunday,” wrote Eric Folkerth, the senior pastor at Dallas’s Northaven United Methodist Church, in an email. More than 30 first-time visitors signed in that day, “which is more than double the average [across] three weeks of a typical year,” he added. “I sincerely don’t recall another time when it feels like there has been a sustained desire on people’s part to be together with other progressive Christians.”

Anecdotal evidence suggests other liberal churches from a variety of denominations have been experiencing a similar spike over the past month, with their higher-than-usual levels of attendance staying relatively constant for several weeks. It’s not at all clear that the Trump bump, as the writer Diana Butler Bass termed it in a conversation with me, will be sustained beyond the first few months of the new administration. But it suggests that some progressives are searching for a moral vocabulary in grappling with the president-elect—including ways of thinking about community that don’t have to do with electoral politics. […]

Even if Trump doesn’t bring about a membership revolution in the American mainline, which has been steadily shrinking for years, some of the conversations these Protestant pastors reported were fascinating—and suggest that this political environment might be theologically, morally, and intellectually generative for progressive religious traditions.

Southern Baptists Call Off the Culture War
by Jonathan Merritt

Indeed, disentangling the SBC from the GOP is central to the denomination’s makeover. For example, a motion to defund the ERLC in response to the agency’s full-throated opposition to Donald Trump failed miserably.

In years past, Republican politicians have spoken to messengers at the annual meeting. In 1991, President George H.W. Bush addressed the group, Vice President Dan Quayle spoke in 1992, and President George W. Bush did so in 2001 and 2002 (when my father, James Merritt, was SBC president). Neither President Bill Clinton nor President Barack Obama were invited to speak to Southern Baptists during their terms. Though Southern Baptists claim not to be affiliated with either major party, it’s not difficult to discern the pattern at play.

Vice President Mike Pence addressed the convention this year, which may seem like the same old song to outsiders. But there was widespread resistance to Pence’s participation. A motion to disinvite the vice president was proposed and debated, but was ultimately voted down. During his address, which hit some notes more typical of a campaign speech, a few Southern Baptists left the room out of protest. Others criticized the move to reporters or spoke out on Twitter. The newly elected Greear tweeted that the invitation “sent a terribly mixed signal” and reminded his fellow Baptists that “commissioned missionaries, not political platforms, are what we do.”

Though most Southern Baptists remain politically conservative, it seems that some are now less willing to have their denomination serve as a handmaiden to the GOP, especially in the current political moment. They appear to recognize that tethering themselves to Donald Trump—a thrice-married man who has bragged about committing adultery, lies with impunity, allegedly paid hush money to a porn star with whom he had an affair, and says he has never asked God for forgiveness—places the moral credibility of the Southern Baptist Convention at risk.

By elevating women and distancing themselves from partisan engagement, the members of the SBC appear to be signaling their determination to head in a different direction, out of a mix of pragmatism and principle.

For more than a decade, the denomination has been experiencing precipitous decline by almost every metric. Baptisms are at a 70-year low, and Sunday attendance is at a 20-year low. Southern Baptist churches lost almost 80,000 members from 2016 to 2017 and they have hemorrhaged a whopping one million members since 2003. For years, Southern Baptists have criticized more liberal denominations for their declines, but their own trends are now running parallel. The next crop of leaders knows something must be done.

“Southern Baptists thought that if they became more conservative, their growth would continue unabated. But they couldn’t outrun the demographics and hold the decline at bay,” said Leonard. “Classic fundamentalist old-guard churches are either dead or dying, and the younger generation is realizing that the old way of articulating the gospel is turning away more people than it is attracting.”

Regardless of their motivations, this shift away from a more culturally strident and politically partisan stance is significant.

As the late pastor Adrian Rogers said at the 2002 SBC annual meeting in St. Louis, “As the West goes, so goes the world. As America goes, so goes the West. As Christianity goes, so goes America. As evangelicals go, so goes Christianity. As Southern Baptists go, so go evangelicals.”

Rogers may have had an inflated sense of the denomination’s importance, but the fact remains that what happens in the SBC often ripples across culture. In Trump’s America, where the religious right wields outsized influence, the shifts among Southern Baptists could be a harbinger of broader change among evangelicals.

The divide between the religious and the rest of the population is smaller than it seems. That is because the media like to play up conflict. To demonstrate the actual views of religious Americans, consider a hot-button issue like abortion:

  • “As an example of the complexity, data shows that there isn’t even an anti-abortion consensus among Christians, only one Christian demographic showing a strong majority [White Evangelical Protestants].” (Claims of US Becoming Pro-Life)
  • “[A]long with most doctors, most church-going Catholics support public option and so are in agreement with most Americans in general. Even more interesting is the fact that the church-going Catholics even support a national plan that includes funding for abortion.” (Health Reform & Public Option (polls & other info))
  • “[M]ost Americans identify as Christian and have done so for generations. Yet most Americans are pro-choice, supporting abortion in most or all situations, even as most Americans also support there being strong and clear regulations for where abortions shouldn’t be allowed. It’s complicated, specifically among Christians. The vast majority (70%) seeking abortions considered themselves Christians, including over 50% who attend church regularly having kept their abortions secret from their church community and 40% feeling that churches are not equipped to help them make decisions about unwanted pregnancies.” (American Christianity: History, Politics, & Social Issues)

Whatever ideological and political conflicts we might have in the future, they won't be a continuation of the culture wars we have known up to this point. Nor will they likely conform to the battle of ideologies seen during the Cold War. The entire frame of debate will be different and, barring unforeseen events, most likely far to the left.

* * *

As an additional point, there is another shift happening. There is a reason the antagonism feels like it is growing, even though it's not ideological per se.

The fact of the matter is that "religious nones" (atheists, agnostics, the religiously non-identifying, the religiously indifferent, etc.) are growing faster than any religious group. Mainline Christians have been losing membership for decades, and now so are Evangelicals. It is getting to the point where young Americans are evenly split between the religious and non-religious. That means the religious majority will quickly disappear.

This isn't motivated by overt ideology, or at least it doesn't seem to be, since the same shift is happening in many other countries as well. But it puts pressure on ideology and can get expressed or manipulated through ideological rhetoric. So, we might see increasing conflict between ideologies, maybe in new forms that could create a new left versus right.

Younger people are less religious than older ones in many countries, especially in the U.S. and Europe
by Stephanie Kramer & Dalia Fahmy

In the U.S., the age gap is considerable: 43% of people under age 40 say religion is very important to them, compared with 60% of adults ages 40 and over.

If nothing else, this contributes to a generational conflict. There is a reason much of right-wing media has viewers who are, on average, older. This is why many older Americans are still fighting the culture wars, if only in their own minds.

But Americans in general, including most young Evangelicals, have lost interest in politicized religion. Christianity simply won’t play the same kind of central role in coming decades. Religion will remain an issue, but even Republicans will have to deal with the fact that even the young on the political right are less religious and less socially conservative.

Are Wrens Smarter Than Racists?

Race realists and racial supremacists have many odd notions. For one, they believe human races are separate species, despite all the evidence to the contrary (e.g., humans have unusually low genetic diversity compared to similar species; two random humans are more likely to be genetically similar than two random chimpanzees).

But an even stranger belief is that humans, despite being such a highly social species, are somehow incapable of cooperating with other humans who are perceived as different based on modern social constructions of 'race'. Yet, even ignoring the fact that all humans are of the same species, numerous other species cooperate all the time across far larger genetic divides. This includes the development of close relationships between individuals of separate species.

So, why do racists believe that 'white' Americans and 'black' Americans must be treated as separate species and be inevitably segregated into different communities and countries? That particularly doesn't make sense considering that most so-called African-Americans are significantly of European ancestry, not to mention the surprising number of supposed European-Americans in the South who have non-European ancestry (African, Native American, etc.).

Wrens don’t let racism get in the way of promoting their own survival through befriending other species who share their territory. Do human racists think they have less cognitive capacity than wrens? If that is their honest assessment of their own abilities, that is fine. But why do they assume everyone else is as deficient as they are?

* * *

Birds from different species recognize each other and cooperate
by Matt Wood, University of Chicago


Cooperation among different species of birds is common. Some birds build their nests near those of larger, more aggressive species to deter predators, and flocks of mixed species forage for food and defend territories together in alliances that can last for years. In most cases, though, these partnerships are not between specific individuals of the other species—any bird from the other species will do.

But in a new study published in the journal Behavioral Ecology, scientists from the University of Chicago and University of Nebraska show how two different species of Australian fairy-wrens not only recognize individual birds from other species, but also form long-term partnerships that help them forage and defend their shared space as a group.

“Finding that these two species associate was not surprising, as mixed species flocks of birds are observed all over the world,” said Allison Johnson, PhD, a postdoctoral scholar at the University of Nebraska who conducted the study as part of her dissertation research at UChicago. “But when we realized they were sharing territories with specific individuals and responding aggressively only to unknown individuals, we knew this was really unique. It completely changed our research and we knew we had to investigate it.”

Variegated fairy-wrens and splendid fairy-wrens are two small songbirds that live in Australia. The males of each species have striking, bright blue feathers that make them popular with bird watchers. Their behavior also makes them an appealing subject for biologists. Both species feed on insects, live in large family groups, and breed during the same time of year. They are also non-migratory, meaning they live in one area for their entire lives, occupying the same eucalyptus scrublands that provide plenty of bushes and trees for cover.

When these territories overlap, the two species interact with each other. They forage together, travel together, and seem to be aware of what the other species is doing. They also help each other defend their territory from rivals. Variegated fairy-wrens will defend their shared territory from both variegated and splendid outsiders; splendid fairy-wrens will do the same, while fending off unfamiliar birds from both species.

“Splendid and variegated fairy-wrens are so similar in their habitat preferences and behavior, we would expect them to act as competitors. Instead, we’ve found stable, positive relationships between individuals of the two species,” said Christina Masco, PhD, a graduate student at UChicago and a co-author on the new paper.

Epigenetic Memory and the Mind

Epigenetics is fascinating, even bizarre by conventional thought. Some worry that it’s another variety of determinism, just not located in the genes. I have other worries, if not that particular one.

How epigenetics works is that a gene gets switched on or off. The key point is that the setting is not permanent: some later incident, condition, or behavior can switch it back the other way. Genes in your body are switched on and off throughout your lifetime. But presumably, if no significant changes occur in one's life, some epigenetic settings remain in place for one's entire life.

Where it gets fascinating is that epigenetic changes have been shown to be passed on across multiple generations, and no one is certain how many. In mice, the effects can extend upwards of seven generations or so, as I recall. Humans, of course, haven't been studied for that many generations. But present evidence indicates the process operates similarly in humans.

Potentially, all of the major tragedies in modern history (the violence of colonialism around the world, major famines in places like Ireland and China, genocides in places like the United States and Rwanda, international conflicts like the world wars, etc.) fall within the range of epigenetics. It's been shown, for example, that famine switches genes for a few generations in a way that causes increased fat retention, which in the modern world means higher obesity rates.

I'm not sure what the precise mechanism is that causes genes to switch on and off (e.g., precisely how starvation gets imprinted on biology and stays set that way for multiple generations). All I know is that it has to do with the proteins that encase the DNA. The main interest is that, once we do understand the mechanism, we will be able to control the process. This might be a way of preventing or managing numerous physical and psychiatric health conditions. So, it really would mean the opposite of determinism.

This research reminds me of other scientific and anecdotal evidence. Consider the recipients of organ transplants, blood and bone marrow transfusions, and microbiome transference. Each involves the exchange of cells from one body to another. The results have shown changes in mood, behavior, biological functioning, etc.

For example, introducing a new microbiome can make a skinny rodent fat or a fat rodent skinny. But shifts in fairly specific memories have also been observed, such as an organ transplant recipient craving something the organ donor craved. Furthermore, research has shown that genetic material can jump from introduced cells to the cells already present, which is how a baby can potentially end up with cells from two fathers if a previous pregnancy was by a different father. In fact, it's rather common for people to carry multiple sets of DNA in their bodies.

It intuitively makes sense that epigenetics could underlie memory. It's easy to argue that no other function in the body has this kind and degree of capacity. And that possibility would blow up our ideas of the human mind. In that case, some element of memory would get passed on for multiple generations, explaining certain similarities seen in families and in larger populations with shared epigenetic backgrounds.

This gives new meaning to theories of both the embodied mind and the extended mind. There might also be some interesting implications for the bundle theory of mind. I wonder too about something like enactivism, which concerns the human mind's relation to the world. Of course, there are obvious connections between this specific research and neurological plasticity, and between epigenetics more generally and intergenerational trauma.

So, it wouldn't only be the symptoms of trauma, or the benefits of privilege (or whatever other conditions shape individuals, generational cohorts, and sub-populations), that get inherited, but some of the memory itself. This puts bodily memory in a much larger context, maybe even something along the lines of Jungian thought, in terms of collective memory and archetypes (depending on how long-lasting some epigenetic effects might be). Also, much of what people think of as cultural, ethnic, and racial differences might simply be epigenetics. This would puncture an even larger hole in genetic determinism and race realism. Unlike genetics, epigenetics can be changed.

Our understanding of so much is going to be completely altered. What once seemed crazy or unthinkable will become the new dominant paradigm. This is both promising and scary. Imagine what authoritarian governments could do with this scientific knowledge. The Nazis could only dream of creating a superman. But between genetic engineering and epigenetic manipulations, the possibilities are wide open. And right now, we have no clue what we are doing. The early experimentation, specifically research done covertly, is going to be of the mad scientist variety.

These interesting times are going to get way more interesting.

* * *

Could Memory Traces Exist in Cell Bodies?
by Susan Cosier

The finding is surprising because it suggests that a nerve cell body “knows” how many synapses it is supposed to form, meaning it is encoding a crucial part of memory. The researchers also ran a similar experiment on live sea slugs, in which they found that a long-term memory could be totally erased (as gauged by its synapses being destroyed) and then re-formed with only a small reminder stimulus—again suggesting that some information was being stored in a neuron’s body.

Synapses may be like a concert pianist’s fingers, explains principal investigator David Glanzman, a neurologist at U.C.L.A. Even if Chopin did not have his fingers, he would still know how to play his sonatas. “This is a radical idea, and I don’t deny it: memory really isn’t stored in synapses,” Glanzman says.

Other memory experts are intrigued by the findings but cautious about interpreting the results. Even if neurons retain information about how many synapses to form, it is unclear how the cells could know where to put the synapses or how strong they should be—which are crucial components of memory storage. Yet the work indeed suggests that synapses might not be set in stone as they encode memory: they may wither and re-form as a memory waxes and wanes. “The results are really just kind of surprising,” says Todd Sacktor, a neurologist at SUNY Downstate Medical Center. “It has always been this assumption that it’s the same synapses that are storing the memory,” he says. “And the essence of what [Glanzman] is saying is that it’s far more dynamic.”

Memory Transferred Between Snails, Challenging Standard Theory of How the Brain Remembers
by Usha Lee McFarling

Glanzman’s experiments—funded by the National Institutes of Health and the National Science Foundation—involved giving mild electrical shocks to the marine snail Aplysia californica. Shocked snails learn to withdraw their delicate siphons and gills for nearly a minute as a defense when they subsequently receive a weak touch; snails that have not been shocked withdraw only briefly.

The researchers extracted RNA from the nervous systems of snails that had been shocked and injected the material into unshocked snails. RNA’s primary role is to serve as a messenger inside cells, carrying protein-making instructions from its cousin DNA. But when this RNA was injected, these naive snails withdrew their siphons for extended periods of time after a soft touch. Control snails that received injections of RNA from snails that had not received shocks did not withdraw their siphons for as long.

“It’s as if we transferred a memory,” Glanzman said.

Glanzman’s group went further, showing that Aplysia sensory neurons in Petri dishes were more excitable, as they tend to be after being shocked, if they were exposed to RNA from shocked snails. Exposure to RNA from snails that had never been shocked did not cause the cells to become more excitable.

The results, said Glanzman, suggest that memories may be stored within the nucleus of neurons, where RNA is synthesized and can act on DNA to turn genes on and off. He said he thought memory storage involved these epigenetic changes—changes in the activity of genes and not in the DNA sequences that make up those genes—that are mediated by RNA.

This view challenges the widely held notion that memories are stored by enhancing synaptic connections between neurons. Rather, Glanzman sees synaptic changes that occur during memory formation as flowing from the information that the RNA is carrying.

Stress Is Real, As Are The Symptoms

I was reading a book, Strange Contagion by Lee Daniel Kravetz, where he dismisses complaints about wind turbines (e.g., low-frequency sounds). It’s actually a great read, even as I disagree with elements of it, such as his entirely overlooking inequality as a cause of strange contagions (public hysteria, suicide clusters, etc.) — an issue explored in depth by Keith Payne in The Broken Ladder and briefly touched upon by Kurt Andersen in Fantasyland.

By the way, one might note that where wind farms are located, as with where toxic dumps are located, has everything to do with economic, social, and political disparities — specifically as exacerbated by poverty, economic segregation, residential isolation, failing local economies, dying small towns, inadequate healthcare, underfunded or non-existent public services, limited coverage in the corporate media, underrepresentation in positions of power and authority, etc (many of the things that get dismissed in defense of the establishment and status quo). And one might note that the dismissiveness toward inequality problems has strong resemblances to the dismissiveness toward wind turbine syndrome or wind farm syndrome.

About wind turbines, Kravetz details the claims against them, writing that, “People closest to the four-hundred-foot-tall turrets receive more than just electricity. The turbines interrupt their sleep patterns. They also generate faint ringing in their ears. Emissions cause pounding migraine headaches. The motion of the vanes also creates a shadow flicker that triggers disorientation, vertigo, and nausea” (Kindle Locations 959-961). But he goes on to assert that the claimed explanation of cause is entirely without scientific substantiation, even as the symptoms are real:

“Grievances against wind farms are not exclusive to DeKalb County, with a perplexing illness dogging many a wind turbine project. Similar complaints have surfaced in Canada, the UK, Italy, and various US cities like Falmouth, Massachusetts. In 2009 the Connecticut pediatrician Nina Pierpont offered an explanation. Wind turbines, she argued, produce low-frequency noises that induce disruptions in the inner ear and lead to an illness she calls wind turbine syndrome. Her evidence, now largely discredited for sample size errors, a lack of a control group, and no peer review, seemed to point to infrasound coming off of the wind farms. Since then more than a dozen scientific reviews have firmly established that wind turbines pose no unique health risks and are fundamentally safe. It doesn’t seem to matter to the residents of DeKalb County, whose symptoms are quite real.” (Kindle Locations 961-968)

He concludes that it is “wind farm hysteria”. It is one example he uses in exploring the larger issue of what he calls strange contagions, partly related to Richard Dawkins’s theory of memes, although he considers the phenomenon more broadly to include the spread of not just thoughts and ideas but emotions and behaviors. Indeed, he makes a strong overall case in his book, and I’m largely persuaded — or rather, it fits the evidence I’ve previously seen elsewhere. But sometimes his focus is too narrow and conventional. There are valid reasons to consider wind turbines as potentially problematic for human health, despite our not having precisely ascertained and absolutely proven the path of causation.

Stranger Dimensions put out an article by Rob Schwarz, Infrasound: The Fear Frequency, that is directly relevant to the issue. He writes that, “Infrasound is sound below 20 Hz, lower than humans can perceive. But just because we don’t consciously hear it, that doesn’t mean we don’t respond to it; in certain individuals, low-frequency sound can induce feelings of fear or dread or even depression. […] In humans, infrasound can cause a number of strange, seemingly inexplicable effects: headaches, nausea, night terrors and sleep disorders.”

Keep in mind that wind turbines do emit infrasound. The debate has been over whether infrasound can cause ‘disease’ or mere irritation and annoyance. That framing is based on a simplistic and uninformed understanding of stress. A wide array of research has already proven beyond any doubt that continuous stress is a major contributing factor to numerous physiological and psychological health conditions, and of course this relates to the high levels of stress in high-inequality societies. In fact, research shows that ongoing background stress can be more traumatizing over the long term than traumatizing events that are brief. Trauma is simply unresolved stress, and when there are multiple stressors in one’s environment, there is no way to either confront them or escape them. Only part of the population suffers from severe stress, whether because of a single stressor or multiple ones, but stress in general has vastly increased — as Kravetz states in a straightforward manner: “Americans, meanwhile, continue to experience more stress than ever, with one study I read citing an increase of more than 1,000 percent in the past three decades” (Kindle Locations 2194-2195).

The question isn’t whether stress is problematic but how stressful continuous low-frequency sound is, specifically when combined with other stressors, as is the case for many disadvantaged populations near wind farms — plus, besides infrasound, wind turbines are obtrusive with their blinking lights, shadow flicker, and rhythmic pressure pulses on buildings. No research so far has studied the direct influence of long-term, even if low-level, exposure to multiple and often simultaneous stressors, and so there is no way for anyone to honestly conclude that wind turbines aren’t significantly contributing to health concerns, at least for those already sensitized or otherwise in a state of distress (which would describe many rural residents near wind farms, considering that communities are dying and young generations are leaving, contributing to a loss of the social support that would otherwise lessen the impact of stress). Even the doubters admit it has been proven that wind turbines cause annoyance and stress; the debate is over how much and with what impact. Still, that isn’t an argument against wind power and for old energy industries like coal — but maybe wind energy technology could be improved in ways that would ease our transition to alternative energy.

It does make one wonder what we don’t yet understand about how factors that are not easily observed can have significant influence over us. Human senses are severely limited, and so we are largely unaware of the world around us, even when it is causing us harm. The human senses can’t detect tiny parasites, toxins, climate change, etc. And the human tendency is to deny the unknown, even when it is obvious something is going on. It is particularly easy for those not impacted to dismiss those impacted — as when middle-to-upper class citizens, corporate media, government agencies, and politicians ignore the severe lead toxicity rates among mostly poor minorities in old industrial areas. Considering that, maybe scientists who do research and politicians who pass laws should be required to live for several years surrounded by lead toxicity and wind turbines. Then maybe the symptoms would seem more real, and we might finally find a way to help those harmed, if only by reducing some of the risk factors, including stress.

The article by Schwarz went beyond this and, in doing so, went in an interesting direction. He explains that, “If infrasound hits at just the right strength and frequency, it can resonate with human eyes, causing them to vibrate. This can lead to distorted vision and the possibility of “ghost” sightings. Or, at least, what some would call ghost sightings. Infrasound may also cause a person to “feel” that there’s an entity in the room with him or her, accompanied by that aforementioned sense of dread.” He describes an incident in a laboratory that came to have a reputation for feeling haunted; the oppressive atmosphere disappeared when a particular fan was turned off. It turned out the fan was vibrating at just the right frequency to produce a particular low-frequency sound. Now, that is fascinating.

This reminds me of Fortean observations. It’s been noted by a number of paranormal and UFO researchers, such as John Keel, that various odd experiences tend to happen in the same places. UFOs are often repeatedly sighted by different people in the same locations, and at those same locations there will often be bigfoot sightings and accounts of other unusual happenings. Jacques Vallee also noted that certain Fortean incidents tend to follow the same pattern, such as numerous descriptions of UFO abductions matching folktales about fairy abductions and the anthropological literature on shamanistic initiations.

Or consider what are sometimes called fairy lights. No one knows what causes them, but even scientists have observed them. There are many sites specifically known for their fairy lights. My oldest brother went to one of those places and indeed saw the same thing that thousands of others had seen. The weird thing about these balls of light is that it is hard to discern exactly how far away they are; they go from seeming close to seeming far. It’s possible that there is nothing actually there and instead it is some frequency affecting the brain.

Maybe there is a diversity of human experiences that share common mechanisms or involve overlapping factors. In that case, we simply haven’t figured them out yet. But improved research methods might allow us to look more closely at typically ignored and previously unknown causes. Not only might this lead to the betterment of many lives but also to a greater understanding of the human condition.

Fantasyland, An American Tradition

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, every individual free to believe anything she wishes, has metastasized out of control. From the start, our ultra-individualism was attached to epic dreams, sometimes epic fantasies—every American one of God’s chosen people building a custom-made utopia, each of us free to reinvent himself by imagination and will. In America those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.”
~ Kurt Andersen, Fantasyland

It’s hard to have public debate in the United States for a number of reasons. The most basic reason is that Americans are severely uninformed and disinformed. We also tend to lack a larger context for knowledge. Historical amnesia is rampant and scientific literacy is limited, exacerbated by centuries old strains of anti-intellectualism and dogmatic idealism, hyper-individualism and sectarian groupthink, public distrust and authoritarian demagoguery.

This doesn’t seem as common elsewhere. Part of the reason is that Americans are less aware and informed about other countries than the citizens of other countries are about the United States. Living anywhere else in the world, one finds it nearly impossible not to know in great detail about the United States and other Western powers, as the entire world cannot escape these influences that cast a long shadow: colonial imperialism, neoliberal globalization, transnational corporations, mass media, monocultural dominance, soft power, international propaganda campaigns during the Cold War, military interventionism, etc. The rest of the world can’t afford the luxury of ignorance that Americans enjoy.

Early last century, when the United States was a rising global superpower competing against other rising superpowers, the US was known for having one of the better education systems in the world. International competition motivated us to invest in education. Now we are famous for how poorly recent generations of students compare to those of many other developed countries. But even the brief moment of seeming American greatness following World War II might have had more to do with the wide-scale devastation of Europe — a temporary lowering of other developed countries rather than a vast improvement in the United States.

There has also been a failure of big biz mass media to inform the public, and the continuing oligopolistic consolidation of corporate media into a few hands has not allowed a competitive free market to force corporations to offer something better. On top of that, Americans are one of the most propagandized and indoctrinated populations on the planet, with only a few comparable countries, such as China and Russia, exceeding us in this area.

See how the near unanimity of the American mass media was able, by beating the war drum, to change majority public opinion from opposing the Iraq War to supporting it. It just so happens that the parent companies of most of the corporate media, with ties to the main political parties and the military-industrial complex, profit immensely from the endless wars of the war state.

Corporate media is in the business of making money which means selling a product. In late stage capitalism, all of media is entertainment and news media is infotainment. Even the viewers are sold as a product to advertisers. There is no profit in offering a public service to inform the citizenry and create the conditions for informed public debate. As part of consumerist society, we consume as we are consumed by endless fantasies, just-so stories, comforting lies, simplistic narratives, and political spectacle.

This is a dark truth that should concern and scare Americans. But that would require them to be informed first. There is the rub.

Every public debate in the United States begins with mainstream framing. It requires hours of interacting with a typical American even to maybe get them to acknowledge their lack of knowledge, assuming they have the intellectual humility that makes that likely. Americans are so uninformed and misinformed that they don’t realize they are ignorant, so indoctrinated that they don’t realize how much their minds are manipulated and saturated in bullshit (I speak from the expertise of being an American who has been woefully ignorant for most of my life). To simply get to the level of knowledge where debate is even within the realm of possibility is itself almost an impossible task. To say it is frustrating is an extreme understatement.

Consider how most Americans “know” that tough-on-crime laws, stop-and-frisk, broken-windows policies, heavy policing, and mass incarceration were the cause of decreased crime. How do they know? Because decades of political rhetoric and media narratives have told them so. Just as various authority figures in government and media told them, implied, or remained silent while others pushed the lies that the 9/11 terrorist attack was somehow connected to Iraq, which supposedly had weapons of mass destruction — despite the fact that US intelligence agencies and foreign governments at the time knew these were lies.

Sure, you can look to alternative media for regular reporting of information that undermines and disproves these beliefs. But few Americans get much if any of their news from alternative media. Hundreds of high-quality scientific studies, careful analyses, and scholarly books have come out since the violent crime decline began. This information, however, is almost entirely unknown to the average American citizen — and, one suspects, largely unknown to the average American mainstream news reporter, media personality, talking head, pundit, think tank hack, and politician.

That isn’t to say there isn’t ignorance found in other populations as well. Having been in the online world since the early naughts, I’ve met and talked with many people from other countries and admittedly some of them are less than perfectly informed. Still, the level of ignorance in the United States is unique, at least in the Western world.

That much can’t be doubted. Other serious thinkers might have differing explanations for why the US has diverged so greatly from much of the rest of the world, from its level of education to its rate of violence. But one way or another, it needs to be explained in the hope of finding a remedy. Sadly, even if we could agree on a solution, those in power benefit too greatly from the ongoing state of an easily manipulated citizenry that lacks knowledge and critical thinking skills.

This isn’t merely an attack on low-information voters and right-wing nut jobs. Even in dealing with highly educated Americans among the liberal class, I rarely come across someone who is deeply and widely informed across various major topics of public concern.

American society is highly insular. We Americans are not only disconnected from the rest of the world but disconnected from each other. And so we have little sense of what is going on outside of the narrow constraints of our neighborhoods, communities, workplaces, social networks, and echo chambers. The United States is psychologically and geographically segregated into separate reality tunnel enclaves defined by region and residency, education and class, race and religion, politics and media.

It’s because we so rarely step outside of our respective worlds that we so rarely realize how little we know and how much of what we think we know is not true. Most of us live in neighborhoods, go to churches and stores, attend or send our kids to schools, work and socialize with people who are exactly like ourselves. They share our beliefs and values, our talking points and political persuasion, our biases and prejudices, our social and class position. We are hermetically sealed within our safe walled-in social identities. Nothing can reach us, threaten us, or change us.

That is until something happens like Donald Trump being elected. Then there is a panic about what has become of America in this post-fact age. The sad reality, however, is America has always been this way. It’s just finally getting to a point where it’s harder to ignore and that potential for public awakening offers some hope.

* * *

Fantasyland
by Kurt Andersen
pp. 10-14

Why are we like this?

. . . The short answer is because we’re Americans, because being American means we can believe any damn thing we want, that our beliefs are equal or superior to anyone else’s, experts be damned. Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites. Yet that hated Establishment, the institutions and forces that once kept us from overdoing the flagrantly untrue or absurd—media, academia, politics, government, corporate America, professional associations, respectable opinion in the aggregate—has enabled and encouraged every species of fantasy over the last few decades.

A senior physician at one of America’s most prestigious university hospitals promotes miracle cures on his daily TV show. Major cable channels air documentaries treating mermaids, monsters, ghosts, and angels as real. A CNN anchor speculated on the air that the disappearance of a Malaysian airliner was a supernatural event. State legislatures and one of our two big political parties pass resolutions to resist the imaginary impositions of a New World Order and Islamic law. When a political scientist attacks the idea that “there is some ‘public’ that shares a notion of reality, a concept of reason, and a set of criteria by which claims to reason and rationality are judged,” colleagues just nod and grant tenure. A white woman felt black, pretended to be, and under those fantasy auspices became an NAACP official—and then, busted, said, “It’s not a costume…not something that I can put on and take off anymore. I wouldn’t say I’m African American, but I would say I’m black.” Bill Gates’s foundation has funded an institute devoted to creationist pseudoscience. Despite his nonstop lies and obvious fantasies—rather, because of them—Donald Trump was elected president. The old fringes have been folded into the new center. The irrational has become respectable and often unstoppable. As particular fantasies get traction and become contagious, other fantasists are encouraged by a cascade of out-of-control tolerance. It’s a kind of twisted Golden Rule unconsciously followed: If those people believe that, then certainly we can believe this.

Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the last several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks with no easy exit. Voilà: Fantasyland. . . .

When John Adams said in the 1700s that “facts are stubborn things,” the overriding American principle of personal freedom was not yet enshrined in the Declaration or the Constitution, and the United States of America was itself still a dream. Two and a half centuries later the nation Adams cofounded has become a majority-rule de facto refutation of his truism: “our wishes, our inclinations” and “the dictates of our passions” now apparently do “alter the state of facts and evidence,” because extreme cognitive liberty and the pursuit of happiness rule.

This is not unique to America, people treating real life as fantasy and vice versa, and taking preposterous ideas seriously. We’re just uniquely immersed. In the developed world, our predilection is extreme, distinctly different in the breadth and depth of our embrace of fantasies of many different kinds. Sure, the physician whose fraudulent research launched the antivaccine movement was a Brit, and young Japanese otaku invented cosplay, dressing up as fantasy characters. And while there are believers in flamboyant supernaturalism and prophecy and religious pseudoscience in other developed countries, nowhere else in the rich world are such beliefs central to the self-identities of so many people. We are Fantasyland’s global crucible and epicenter.

This is American exceptionalism in the twenty-first century. America has always been a one-of-a-kind place. Our singularity is different now. We’re still rich and free, still more influential and powerful than any nation, practically a synonym for developed country. But at the same time, our drift toward credulity, doing our own thing, and having an altogether uncertain grip on reality has overwhelmed our other exceptional national traits and turned us into a less-developed country as well.

People tend to regard the Trump moment—this post-truth, alternative facts moment—as some inexplicable and crazy new American phenomenon. In fact, what’s happening is just the ultimate extrapolation and expression of attitudes and instincts that have made America exceptional for its entire history—and really, from its prehistory. . . .

America was created by true believers and passionate dreamers, by hucksters and their suckers—which over the course of four centuries has made us susceptible to fantasy, as epitomized by everything from Salem hunting witches to Joseph Smith creating Mormonism, from P. T. Barnum to Henry David Thoreau to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Donald Trump. In other words: mix epic individualism with extreme religion; mix show business with everything else; let all that steep and simmer for a few centuries; run it through the anything-goes 1960s and the Internet age; the result is the America we inhabit today, where reality and fantasy are weirdly and dangerously blurred and commingled.

I hope we’re only on a long temporary detour, that we’ll manage somehow to get back on track. If we’re on a bender, suffering the effects of guzzling too much fantasy cocktail for too long, if that’s why we’re stumbling, manic and hysterical, mightn’t we somehow sober up and recover? You would think. But first you need to understand how deeply this tendency has been encoded in our national DNA.

Fake News: It’s as American as George Washington’s Cherry Tree
by Hanna Rosin

Fake news. Post-truth. Alternative facts. For Andersen, these are not momentary perversions but habits baked into our DNA, the ultimate expressions of attitudes “that have made America exceptional for its entire history.” The country’s initial devotion to religious and intellectual freedom, Andersen argues, has over the centuries morphed into a fierce entitlement to custom-made reality. So your right to believe in angels and your neighbor’s right to believe in U.F.O.s and Rachel Dolezal’s right to believe she is black lead naturally to our president’s right to insist that his crowds were bigger.

Andersen’s history begins at the beginning, with the first comforting lie we tell ourselves. Each year we teach our children about Pilgrims, those gentle robed creatures who landed at Plymouth Rock. But our real progenitors were the Puritans, who passed the weeks on the trans-Atlantic voyage preaching about the end times and who, when they arrived, vowed to hang any Quaker or Catholic who landed on their shores. They were zealots and also well-educated British gentlemen, which set the tone for what Andersen identifies as a distinctly American endeavor: propping up magical thinking with elaborate scientific proof.

While Newton and Locke were ushering in an Age of Reason in Europe, over in America unreason was taking new seductive forms. A series of mystic visionaries were planting the seeds of extreme entitlement, teaching Americans that they didn’t have to study any book or old English theologian to know what to think, that whatever they felt to be true was true. In Andersen’s telling, you can easily trace the line from the self-appointed 17th-century prophet Anne Hutchinson to Kanye West: She was, he writes, uniquely American “because she was so confident in herself, in her intuitions and idiosyncratic, subjective understanding of reality,” a total stranger to self-doubt.

What happens next in American history, according to Andersen, happens without malevolence, or even intention. Our national character gels into one that’s distinctly comfortable fogging up the boundary between fantasy and reality in nearly every realm. As soon as George Washington dies fake news is born — the story about the cherry tree, or his kneeling in prayer at Valley Forge. Enterprising businessmen quickly figure out ways to make money off the Americans who gleefully embrace untruths.

Cultural Body-Mind

Daniel Everett is an expert on the Pirahã, although he has studied other cultures. It’s unsurprising, then, to find him use the same example in different books. One particular example (seen below) is about bodily form. I bring it up because it contradicts much of the right-wing and reactionary ideology found in genetic determinism, race realism, evolutionary psychology, and present human biodiversity (as opposed to the earlier HBD theory originated by Jonathan Marks).

From the second book below, the excerpt is part of a larger section where Everett responds to the evolutionary psychologist John Tooby, the latter arguing that there is no such thing as ‘culture’ and hence everything is genetic or otherwise biological. Everett’s use of “dark matter of the mind” is his way of attempting to get at a more deeply complex view. This dark matter is of the mind but also of the body.

* * *

How Language Began:
The Story of Humanity’s Greatest Invention

by Daniel L. Everett
pp. 220-221

Culture, patterns of being – such as eating, sleeping, thinking and posture – have been cultivated. A Dutch individual will be unlike the Belgian, the British, the Japanese, or the Navajo, because of the way that their minds have been cultivated – because of the roles they play in a particular set of values and because of how they define, live out and prioritise these values, the roles of individuals in a society and the knowledge they have acquired.

It would be worth exploring further just how understanding language and culture together can enable us better to understand each. Such an understanding would also help to clarify how new languages or dialects or any other variants of speech come about. I think that this principle ‘you talk like who you talk with’ represents all human behaviour. We also eat like who we eat with, think like those we think with, etc. We take on a wide range of shared attributes – our associations shape how we live and behave and appear – our phenotype. Culture affects our gestures and our talk. It can even affect our bodies. Early American anthropologist Franz Boas studied in detail the relationship between environment, culture and bodily form. Boas made a solid case that human body types are highly plastic and change to adapt to local environmental forces, both ecological and cultural.

Less industrialised cultures show biology-culture connections. Among the Pirahã, facial features range impressionistically from slightly Negroid to East Asian, to Native American. Differences between villages or families may have a biological basis, originating in different tribes merging over the last 200 years. One sizeable group of Pirahãs (perhaps thirty to forty) – usually found occupying a single village – are descendants of the Torá, a Chapakuran-speaking group that emigrated to the Maici-Marmelos rivers as long as two centuries ago. Even today Brazilians refer to this group as Torá, though the Pirahãs refer to them as Pirahãs. They are culturally and linguistically fully integrated into the Pirahãs. Their facial features are somewhat different – broader noses, some with epicanthic folds, large foreheads – giving an overall impression of similarity to East Asian features. ‡ Yet body dimensions across all Pirahãs are constant. Men’s waists are, or were when I worked with them, uniformly 27 inches (68 cm), their average height 5 feet 2 inches (157.5 cm) and their average weight 55 kilos (121 pounds). The Pirahã phenotypes are similar not because all Pirahãs necessarily share a single genotype, but because they share a culture, including values, knowledge of what to eat and values about how much to eat, when to eat and the like.

These examples show that even the body does not escape our earlier observation that studies of culture and human social behaviour can be summed up in the slogan that ‘you talk like who you talk with’ or ‘grow like who you grow with’. And the same would have held for all our ancestors, even erectus .

Dark Matter of the Mind:
The Culturally Articulated Unconscious

by Daniel L. Everett
Kindle Locations 1499-1576

Thus while Tooby may be absolutely right that to have meaning, “culture” must be implemented in individual minds, this is no indictment of the concept. In fact, this requirement has long been insisted on by careful students of culture, such as Sapir. Yet unlike, say, Sapir, Tooby has no account of how individual minds— like ants in a colony or neurons in a brain or cells in a body— can form a larger entity emerging from multi-individual sets of knowledge, values, and roles. His own nativist views offer little insight into the unique “unconscious patterning of society” (to paraphrase Sapir) that establishes the “social set” to which individuals belong.

The idea of culture, after all, is just that certain patterns of being— eating, sleeping, thinking, posture, and so forth— have been cultivated and that minds arising from one such “field” will not be like minds cultivated in another “field.” The Dutch individual will be unlike the Belgian, the British, the Japanese, or the Navajo, because of the way that his or her mind has been cultivated— because of the roles he or she plays in a particular value grouping, because of the ranking of values that he or she has come to share, and so on.

We must be clear, of course, that the idea of “cultivation” we are speaking of here is not merely of minds, but of entire individuals— their minds a way of talking about their bodies. From the earliest work on ethnography in the US, for example, Boas showed how cultures affect even body shape. And body shape is a good indication that it is not merely cognition that is effected and affected by culture. The uses, experiences, emotions, senses, and social engagements of our bodies forge the patterns of thought we call mind. […]

Exploring this idea that understanding language can help us understand culture, consider how linguists account for the rise of languages, dialects, and all other local variants of speech. Part of their account is captured in the linguistic truism that “you talk like who you talk with.” And, I argue, this principle actually impinges upon all human behavior. We not only talk like who we talk with, but we also eat like who we eat with, think like those we think with, and so on. We take on a wide range of shared attributes; our associations shape how we live and behave and appear— our phenotype. Culture can affect our gestures and many other aspects of our talk. Boas (1912a, 1912b) takes up the issue of environment, culture, and bodily form. He provides extensive evidence that human body phenotypes are highly plastic and subject to nongenetic local environmental forces (whether dietary, climatological, or social). Had Boas lived later, he might have studied a very clear and dramatic case; namely, the body height of Dutch citizens before and after World War II. This example is worth a close look because it shows that bodies— like behaviors and beliefs— are cultural products and shapers simultaneously.

The curious case of the Netherlanders fascinates me. The Dutch went from among the shortest peoples of Europe to the tallest in the world in just over one century. One account simplistically links the growth in Dutch height with the change in political system (Olson 2014): “The Dutch growth spurt of the mid-19th century coincided with the establishment of the first liberal democracy. Before this time, the Netherlands had grown rich off its colonies but the wealth had stayed in the hands of the elite. After this time, the wealth began to trickle down to all levels of society, the average income went up and so did the height.” Tempting as this single account may be, there were undoubtedly other factors involved, including gene flow and sexual selection between Dutch and other (mainly European) populations, that help explain European body shape relative to the Dutch. But democracy, a new political change from strengthened and enforced cultural values, is a crucial component of the change in the average height of the Dutch, even though the Dutch genotype has not changed significantly in the past two hundred years. For example, consider figures 2.1 and 2.2. In 1825, US male median height was roughly ten centimeters (roughly four inches) taller than the average Dutch. In the 1850s, the median heights of most males in Europe and the USA were lowered. But then around 1900, they begin to rise again. Dutch male median height lagged behind that of most of the world until the late ’50s and early ’60s, when it began to rise at a faster rate than all other nations represented in the chart. By 1975 the Dutch were taller than Americans. Today, the median Dutch male height (183 cm, or roughly just above six feet) is approximately two and a half inches more than the median American male height (177 cm, or roughly five ten). Thus an apparent biological change turns out to be largely a cultural phenomenon.

To see this culture-body connection even more clearly, consider figure 2.2. In this chart, the correlation between wealth and height emerges clearly (not forgetting that the primary determiner of height is the genome). As wealth grew, so did men (and women). This wasn’t matched in the US, however, even though wealth also grew in the US (precise figures are unnecessary). What emerges from this is that Dutch genes are implicated in the Dutch height transformation, from below average to the tallest people in the world. And yet the genes had to await the right cultural conditions before they could be so dramatically expressed. Other cultural differences that contribute to height increases are: (i) economic (e.g., “white collar”) background; (ii) size of family (more children, shorter children); (iii) literacy of the child’s mother (literate mothers provide better diets); (iv) place of residence (residents of agricultural areas tend to be taller than those in industrial environments— better and more plentiful food); and so on (Khazan 2014). Obviously, these factors all have to do with food access. But looked at from a broader angle, food access is clearly a function of values, knowledge, and social roles— that is, culture.

Just as with the Dutch, less-industrialized cultures show culture-body connections. For example, Pirahã phenotype is also subject to change. Facial features among the Pirahãs range impressionistically from slightly Negroid to East Asian to American Indian (to use terms from physical anthropology). Phenotypical differences between villages or families seem to have a biological basis (though no genetic tests have been conducted). This would be due in part to the fact that Pirahã women have trysts with various non-Pirahã visitors (mainly river traders and their crews, but also government workers and contract employees on health assistance assignments, demarcating the Pirahã reservation, etc.). The genetic differences are also partly historical. One sizeable group of Pirahãs (perhaps thirty to forty)— usually found occupying a single village— are descendants of the Torá, a Chapakuran-speaking group that emigrated to the Maici-Marmelos rivers as long as two hundred years ago. Even today Brazilians refer to this group as Torá, though the Pirahãs refer to them as Pirahãs. They are culturally and linguistically fully integrated into the Pirahãs. Their facial features are somewhat different— broader noses; some with epicanthic folds; large foreheads— giving an overall impression of similarity to Cambodian features. This and other evidence show us that the Pirahã gene pool is not closed. Yet body dimensions across all Pirahãs are constant. Men’s waists are or were uniformly 83 centimeters (about 32.5 inches), their average height 157.5 centimeters (five two), and their average weight 55 kilos (about 121 pounds).

I learned about the uniformity in these measurements over the past several decades as I have taken Pirahã men, women, and children to stores in nearby towns to purchase Western clothes when they came out of their villages for medical help. (The Pirahãs always asked that I purchase Brazilian clothes for them so that they would not attract unnecessary stares and comments.) Thus I learned that the measurements for men were nearly identical. Biology alone cannot account for this homogeneity of body form; culture is implicated as well. For example, Pirahãs raised since infancy outside the village are somewhat taller and much heavier than Pirahãs raised in their culture and communities. Even the body does not escape our earlier observation that studies of culture and human social behavior can be summed up in the slogan that “you talk like who you talk with” or “grow like who you grow with.”