Jo Walton On Death

At present, we’re reading a fiction book, Jo Walton’s Or What You Will, and enjoying it. We’ve never read the author before, but plan on looking for more of her writings. This book has elements of postmodernism about it, in how the narrator and some of the characters speak about the story, storytelling, and such; the breaking of the fourth wall. But it’s probably better described as metamodern: combining the way modernity takes itself seriously with the way the postmodern stands back in self-conscious critique, mixed together with experimental playfulness and sincere inquiry; all the while touching on various themes and topics, casual and weighty; but always coming back to what a narrating voice (or voices) means, dipping into the territory of the bundled mind.

A focus of concern the author returns to throughout the book is mortality and the desire for immortality; how the human relationship to death has changed over time, how we speak about it and narrate it, and how it shapes us and our culture. Besides comparing present attitudes and responses to earlier times, at one point she contrasts the modern detachment, confusion, and discomfort with human death with the modern ‘extravagant grieving’ over pets. Anyone in our society is familiar with what she is talking about. A coworker of ours, when her lizard died, was so distraught she couldn’t talk about it, to the extent that she became upset when we brought it up while trying to console her, as we had often talked about her lizard in the past. She was not consoled. It might be the most inconsolable we’ve ever seen anyone about any death of any species.

For whatever reason, we’ve never been that way about death, be it human or non-human; even with a cat we had for more than 20 years, a cat that had sometimes been our sole companion in dark periods. So far, we’ve tended to take it all in stride, with what acceptance we can muster. It’s sad, but so much of life can be sad, if one thinks about it; the world is full of pain and suffering, where death is the least of it and sometimes a blessing. Maybe a lifetime of depression and depressive realism has inoculated us against such things. Death is death. There’s nothing to be done about it, other than to avoid it for as long as is possible and desirable. But as someone who once attempted suicide and used to contemplate it often, we don’t hold it against people who decide to opt out early. Most modern people, though, don’t have this perspective.

pp. 111-113:

“If you are a modern person, in our world now, it’s not unlikely that you might not have known the true grief and loss caused by the death of someone close to you until well into adulthood. […] The estrangement from grieving is a change in human nature, and one that has happened over the course of Sylvia’s lifetime. Young people today are not the same as they were when she was young. […] Death comes to them as a stranger, not an intimate. She notices it first in what she sees as extravagant grieving for animals, and then starts to notice it more when her friends lose parents at older and older ages, and take it harder and harder. […] Then she observes a growing embarrassment in younger people around the mention of real death, where people don’t know what to say or how to react, until talking about it is almost a taboo. Simultaneous with this came the rise of the vampire as attractive, sexual, appealing, rather than a figure of horror. […] Other undead have also undergone this process in art, even zombies by the first decade of the new millennium. Friends with no religion, who mock Sylvia for her vestigial Catholicism, revert to strained religiosity in the face of death because they have no social patterns of coping.

“Look at it this way: Freud wasn’t necessarily wrong about Thanatos. But he was living in a different world, before antibiotics. His patients were very different people from the people of this century. They would all and every one of them have lost siblings, school friends, parents. […] We read Freud now, and wonder how he could have thought of some of these things, but his patients lived crowded together in houses with one bathroom or none, where they shared rooms with their dying siblings and fornicating parents, and where death was a constant and familiar presence. Nor did they feel grief any less for the familiarity of loss. Read Victorian children’s books; read Charlotte M. Yonge (as Sylvia did as a child in her grandmother’s house) and see what a constant presence death is, almost a character—and not necessarily violent death, but death by illness or accident, inevitable death that simply cannot be cured. We mock their wallowing in woe, the crepe, the widow’s weeds, the jewelry made of jet and hair, the huge mausoleums, the black and purple mourning clothes, until we are faced with our own absences in emptiness, with nothing at all to console us and no signals to send to warn others to tread lightly. […]

pp. 115-116:

“Death in fantasy is generally defanged. Ever since Tolkien brought Gandalf back, and Lewis resurrected Aslan, both of them in conscious imitation of Christ, and right at the beginning of the shaping of genre fantasy, death in fantasy novels has been more and more negotiable. It’s more unusual for a beloved character to stay dead than for them to come back to life. Death is for enemies and spear carriers, and the way a spear carrier death is treated is that the main characters will have a single dramatic scene of mourning and then rarely think of them once they turn the page at the end of the chapter. Boromir’s death resonates through the rest of The Lord of the Rings but the imitations of it lesser writers put in do not. Tolkien and Lewis lived through the Great War, and saw as much death as anyone has. Their imitators are modern people, whose understanding of death is much less visceral. Modern fantasy, even, and perhaps especially “grim-dark” fantasy, is often written by people without much close-up experience of death. The horror of the Dead Marshes, in Tolkien, comes direct from Flanders field. They are not there for thrills.

“As for resurrections—she goes to San Marco and sees Fra Angelico’s paintings of the angel in the empty tomb, and Christ harrowing Hell and opening the door that has been closed for so long and letting in light where there was only darkness. The easy way people come back to life in fantasy cheapens resurrection. The ultimate mystery of Christianity becomes commonplace, with the extreme version of the cheapening happening in computer games where there can be an actual fixed price in gold for bringing a party member back to life.”

COVID-19 and States, Lives and Jobs

In reference to the below COVID-19 graph of loss of life and jobs (per capita), someone wrote to us: “Lower left would appear better [i.e., more people alive and working. BDS]. Iowa was slightly lower left, but mostly in the center of all states. Hawaii had lowest excess death rate (negative), but highest job loss. West Virginia, Maine, and Indiana were well balanced.” The graph is from Hamilton Place Strategies. It is included with their brief data analysis as presented in the recent (4/18/21) article, 50 States, 50 Pandemic Responses: An Analysis Of Jobs Lost And Lives Lost, co-authored by Matt McDonald, Stratton Kirton, Matisse Rogers, and Johnny Luo. The time period for the data is unstated, which could make a difference. That aside, most of the states clump near the center, although more states tended toward a higher death toll; but, of course, it’s the outliers in the four quadrants that grab one’s attention.

We didn’t initially give it much careful thought, even though such data does make one curious about what it represents, beyond some seemingly obvious observations. Here was our initial off-the-cuff response: “It maybe should be unsurprising that the most populated states struggled the most with finding a balance or, in some cases, keeping either low.” That was tossed out as a casual comment and it was assumed no explanation was necessary. But apparently it was perceived as surprising (or speculative or something) to our interlocutor, who asked, “Why?” This seems to happen to us a lot, in that we are so used to looking at data that we assume background knowledge and understanding that others don’t always share. It genuinely was not surprising to us, in that ‘populated’ clearly signifies particular kinds of factors and conditions. Once committed to the dialogue, we felt compelled to answer and explain. Continue further down, if you wish to see the unpacking of background info and social context that, once known, makes the graphed data appear well within the range of what might be expected.

It seemed unsurprising to us, as we’ve looked at a lot of analysis of (demographic, economic, and social science) data like this over the years. So, we’re familiar with the kinds of patterns that tend to show up and probable explanations for those patterns. But maybe it seems less intuitively obvious to others (or maybe we’re biased in our views; you can be the judge). In the original article, the authors do note some relevant correlations indicating causal factors: “States with major hospitality and tourism sectors were hit hard in terms of job loss, with the impact falling unevenly across sectors. And states that were in the first wave of infections—when the healthcare system was still learning how to treat COVID-19—fared comparatively worse on their death tolls. New York, which falls into both categories, had the worst overall outcome, with both high excess deaths and high job losses.”

The authors go on to say, “The states that emerged in the best position were Idaho, Utah, and West Virginia, all with some combination of low loss of life and low loss of employment.” Others that did reasonably well were North Carolina, Nebraska, Maine, Indiana, and Wyoming. We don’t recall any of these being hit early by COVID-19 outbreaks, nor are they major tourist and travel destinations, other than NC to some extent. It could also be noted that all are largely rural states, if not as rural as they were last century, but still far more rurally populated (or rather less urbanized, with fewer big cities and metropolitan areas) than the states that had it rough with soaring death and jobless rates: New York, New Jersey, Louisiana, etc. It comes down to a divide between more and less urbanized, and hence more and less populated and dense. That has much to do with the historical economic base that determined how many people, over the generations, have moved to a state and determined their residential location.

As for the really obvious observations, there is the typical clear divide between North and South. Many liberty-minded Southern states, with historically high rates of total mortality and work-related mortality (along with historically overlapping classism and racism), were tolerant of sacrificing the lives of disproportionately non-white workers during a pandemic, particularly when it kept the economy going and maintained corporate profits for a mostly white capitalist class (see: Their Liberty and Your Death). In general, all of the Deep South and Southwest states, along with most of the Upper South states, had above average death tolls (with MS, AL, AZ, and SC leading the pack), whether or not they kept job losses low, although they did mostly keep them down. All of the states that sacrificed jobs to save lives are in the North (AK, RI, MN, MA, etc) or otherwise not in the South (HI), be it caused by intentional policy prioritization or other uncontrollable factors (e.g., reduced tourism). Northern industrial states, as expected, took the biggest economic hit.

As for the initial point we made, larger populations that are more concentrated create the perfect storm of conditions for promoting the spread of contagious diseases. Numerous factors are involved and, though any single factor might not be problematic, all of them taken together can overwhelm the system during a large-scale and/or long-term crisis. That typically describes states with large cities and metropolitan areas. Look at all of the highly populated and urbanized states and, no matter what region they’re in, they are all near the top of excess deaths per capita. None of them managed to balance keeping people alive and employed, though some fared less badly than others. And it is apparent that the worst among them had the highest population density. That last factor might be the most central.

For comparison, here is the land area, population, and population density of the top 6 largest US cities, all in different states: New York City (301.5 sq mi; 8,336,817; 28,317/sq mi), Los Angeles (468.7 sq mi; 3,979,576; 8,484/sq mi), Chicago (227.3 sq mi; 2,693,976; 11,900/sq mi), Houston (637.5 sq mi; 2,320,268; 3,613/sq mi), Phoenix (517.6 sq mi; 1,680,992; 3,120/sq mi), and Philadelphia (134.2 sq mi; 1,584,064; 11,683/sq mi). New York City has about half the land area of Houston and Phoenix, but has about four times the population of Houston and about five times the population of Phoenix. So, even among the largest cities in the US and the world, there are immense differences in population density. States like Texas and Arizona have encouraged urban sprawl which, though horrible for environmental health, does ease the pressure of contagious disease spread.
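Those comparisons can be checked back-of-the-envelope from the figures quoted above (the recomputed densities differ slightly from the quoted ones, presumably because land-area figures vary a bit by source):

```python
# Recompute density (people per square mile) from the quoted
# land area and population figures for the six largest US cities.
cities = {
    "New York City": (301.5, 8_336_817),
    "Los Angeles":   (468.7, 3_979_576),
    "Chicago":       (227.3, 2_693_976),
    "Houston":       (637.5, 2_320_268),
    "Phoenix":       (517.6, 1_680_992),
    "Philadelphia":  (134.2, 1_584_064),
}

for name, (area_sq_mi, population) in cities.items():
    print(f"{name}: {population / area_sq_mi:,.0f}/sq mi")

# Population ratios: NYC versus the sprawling Sun Belt cities.
nyc_pop = cities["New York City"][1]
print(round(nyc_pop / cities["Houston"][1], 1))  # ~3.6, i.e., about four times
print(round(nyc_pop / cities["Phoenix"][1], 1))  # ~5.0, i.e., about five times
```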

This particular pattern of public health problems can be seen as far back as the first era of urbanization with the agricultural revolution, when populations were concentrating, not sprawling. It wasn’t merely the nutritional deficiencies and such from the change in the agricultural diet. The close proximity of humans to each other and to non-human animals allowed diseases to mutate more quickly and spread more easily (a similar probable reason for COVID-19 having originated in China, with wilderness encroachment, habitat destruction, and wild meat markets). Many new diseases appeared with the rise of agricultural civilizations. Even diseases like malaria are suspected to have originated in farming populations before spreading out into wild mosquitoes and hunter-gatherer tribal populations. Even in modern urbanization, humans continue to live close to and even cohabit with non-human animals. This is why populations in New England, where indoor cats are common, have high rates of toxoplasmosis parasitism, despite a generally healthy population.

Plus, at least in the US, these heavily urbanized conditions tend to correlate with high rates of poverty, homelessness, and inequality (partly because most of the poor left rural areas to look for work in cities, where they became concentrated) — all of these strongly correlated with worse health outcomes, particularly the last, inequality. Of the only four states with above average economic inequality in the US, three of them (NY, LA, CA) had all-around bad COVID-19 outcomes, with only high-inequality Connecticut escaping this pattern by remaining moderate on job losses and excess deaths. As expected, the states that did the best in keeping both low were mostly low inequality. Other than two in the mid-range (WV, NC), all of the other cases of COVID-19 success are among the lowest inequality states in the country — according to ranking: 1) UT, 4) WY, 7) NE, 12) ID, 13) ME, and 15) IN. All of the top 10 low inequality states were low in COVID-related mortality and/or unemployment. That result, by the way, is completely predictable, as it matches decades of data on economic inequality and health outcomes. It would be shocking if this present data defied the longstanding connection.

By the way, rural farm and natural resource states tend to be low inequality, whether or not they are low poverty, but research shows that even poverty is far less problematic with less inequality — as economic inequality, besides being a cause or an indicator of divisiveness and stress, correlates to disparities in general: power, representation, legacies, privileges, opportunities, resources, education, healthy food, healthcare, etc (probably entrenched not only in economic, political, and social systems but also epigenetics; maybe even genetics since toxins and other substances, such as oxidized seed oils in cheap processed foods, can act as mutagens which can permanently alter inherited genes; and so inequality gets built into biology, individually and collectively, immediately and transgenerationally). Certain economic sectors tend toward such greater or lesser inequities, and this generally corresponds to residential patterns. But the correlation is hardly causally deterministic, considering the immense variance of inequality among advanced Western countries with more similar cultural and political traditions (party-based representative democracies, individualistic civil rights, and relatively open market economies).

The economic pattern is far different between rural states and urban states, specifically mass urbanization as it’s taken shape over the generations, and it has much to do with historical changes (e.g., factories closed in inner cities and relocated to suburbs and overseas). In big cities, many large populations of the poor (disproportionately non-white) have become economically segregated and concentrated together in ghettoes, old housing, and abandoned industrial areas (because of generations of racist redlining, covenants, loan practices, and employment). These are the least healthy people living in the least healthy conditions (limited healthcare, lack of parks and green spaces, lead toxicity, air pollution, high stress, food deserts, malnutrition, processed foods, etc), all strongly tied to COVID-19 comorbidities. In these population dense and impoverished areas, there is also a lack of healthcare infrastructure and staffing that is especially needed during a public health crisis, and what healthcare exists is deficient and underfunded.

To complicate things, such densely populated areas of mass urbanization make public health difficult because there are so many other factors as well. Particularly in American cities, with immigrant and ethnic populations historically and increasingly drawn to them, additional factors include diverse sub-populations, neighborhoods, housing conditions, living arrangements, places of employment, social activities, etc. And all of these factors are overlapping, interacting, and compounding in ways not entirely predictable. This might be exacerbated by cultural diversity, since each culture has varying ways of relating to issues of health, healthcare, and authority figures; such as with mask mandates, vaccination programs, etc. It would be challenging to successfully plan and effectively implement a single statewide or citywide public health policy and message; as compared to a mostly homogeneous small population in a small rural state (or even a mostly homogeneous small population in a small urban country).

Also, disease outbreaks in big cities and metropolitan areas are much harder to contain using isolation and quarantines, as many people live so close together in apartment buildings and high-rises, particularly the poor, where larger numbers of people might be packed into single apartments and/or multiple generations into a single household; and that is combined with more use of mass public transit. This came up as an issue in some countries, such as in Southern Europe. Italians tend to live together in multigenerational households and tend to take in family members when unemployed. Combined with poverty, inequality, and policies of economic austerity, the Italian government’s struggle to contain the COVID-19 pandemic made it stand out among Western countries, in that it showed early on the risks of failing to contain a pandemic quickly. But, in many ways, it might have been as much or more of a sociocultural challenge than a political failure.

On the completely opposite extreme, Swedes have the highest rate in the world of living alone, but also some of the lowest poverty and inequality in the world. So, even though Sweden is heavily urbanized (88.2%), contagious disease control is easier; particularly with an already healthy population, universal healthcare, and a well-funded public health system (no economic austerity to be found in Swedish social services). Indeed, they only had to implement moderate public measures and, with a high-trust culture, most of the citizenry willingly and effectively complied without it becoming a politicized and polarized debate involving a partisan battle for power and control. By the way, Sweden has a national population only slightly above that of NYC proper, but less than the NYC metro. Of Nordic cities, Stockholm is the largest in area and the most population dense: total density (13,000/sq mi), urban density (11,000/sq mi), and metro density (950/sq mi). New York City has about two and a half times that urban density.

Then again, all of that isolated urbanization takes its toll in other ways, such as a higher suicide rate (is suicide contagious?). It is one of the more common causes of death in Sweden, with one of the higher rates in the West; in the context of Europe being one of the most suicidal continents in the world, although it’s Eastern Europe that is really bad. Among 182 countries, Sweden is 32nd highest in the world with 13.8 suicides per 100,000; compared to Italy at 142nd place with 5.5 suicides per 100,000. That is two and a half times as high. But, on a positive note, COVID-19 seems to have had no negative impact in worsening the Swedish suicide epidemic (Christian Rück et al, Will the COVID-19 pandemic lead to a tsunami of suicides? A Swedish nationwide analysis of historical and 2020 data), as presumably being socially isolated or at least residentially isolated is already normalized. If anything, suicidal inclinations might become less compelling, or at least suicide attempts no more likely, with the apparently successful response of the Swedish government to COVID-19, especially combined with the Swedish culture of trust. Not that global pandemic panic and local pandemic shutdown would be a net gain for Swedish mental health (Lance M. McCracken et al, Psychological impact of COVID-19 in the Swedish population: Depression, anxiety, and insomnia and their associations to risk and vulnerability factors).
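For what it’s worth, the ratio between those two quoted national rates is simple arithmetic:

```python
# Quoted national suicide rates, per 100,000 people.
sweden_rate = 13.8
italy_rate = 5.5

# Sweden's rate relative to Italy's.
print(round(sweden_rate / italy_rate, 1))  # 2.5, i.e., two and a half times
```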

So, theoretically, public health during pandemics doesn’t necessarily have to be worse in large dense urban areas, as other factors might supersede. But, unfortunately, it apparently was worse in the US under present (social, economic, and political) conditions, however those conditions came about (a whole other discussion barely touched upon here). Many of the states that fared badly are massively larger than Sweden. As seen with New York City, the US has cities and metros that are larger than many countries in the world. These unique conditions of not merely mass urbanization but vast urbanization have never before existed in global history. The US population now in the COVID-19 outbreak is more than three times larger than during the 1918 Flu. The five boroughs of NYC have almost doubled in population over the past century, with Queens almost five times as populated, and surely the NYC metro area has increased far more.

Places like Houston, Los Angeles, Chicago, and New York City are hubs in immense systems of commerce, transport, and travel with heavily used airports and sea ports, interstate highways and railways, a constant flow of people and products from all over the country and the world (the rise of mass world travel and troop transport was a key factor in the 1918 Flu, helping it to mutate and spread in the deadly second and third waves). Systems thinking and complexity theory have come up in our studies and readings over the years, including in discussions with our father whose expertise directly involves systems used in businesses and markets, particularly factory production, warehousing, and supply chains. Those are relatively simple systems that can to varying degrees be analyzed, predicted, planned, and controlled. But massive and dense populations in highly connected urban areas are unimaginably complex systems with numerous confounding factors and uncontrolled variables, unintended consequences and emergent properties. Add a pandemic to all of that and we are largely in unknown territory, as the last pandemic in the US was over a century ago when the world was far different.

Also, there is the issue of how systems differ according to locations and concentrations of various demographics, specifically in contrasting the privileged and underprivileged. That goes back to the issue of poverty, inequality, and much else. A major reason we’ve had so many problems is because most politicians, lobbyists, media figures, public intellectuals, and social influencers involved in the ‘mainstream’ debate that gets heard and televised are living in separate comfortable, safe, and healthy communities, separate from both the rural and urban masses, particularly separate from minorities, the poor, and the working class (see: Mental Pandemic and Ideological Lockdown). We could note that the individual who originally showed us the graphed data, as mentioned at the beginning of the post, is of this typical demographic of wealthier urban whites who have never personally experienced impoverished population density (AKA slums or ghettoes). And even though urban, like us, he lives in this same rural state with clean air, surrounded by the open greenspace of parks, woods, and farms; not to mention being smack dab in the middle of the complete opposite of a food desert. This could be why our reference to ‘populated’ states could gain no purchase in his mind and imagination.

Obviously, as complex systems, the densely populated big cities and metros described above aren’t isolated and insular units, contained and controlled experiments. Their populations and economies are inseparable from the rest of the global society, even more so in this age of neoliberal globalization. That alone would complicate a pandemic response dealing with either excess deaths per capita or job loss per capita, and it would further exacerbate the even greater complexity of finding a balance between the two. When these major centers of industrial production, service industry, commerce, trade, transportation, marketing, and finance get shut down (for any reason), and/or when other closely linked major centers get shut down, it severely cripples the entire economy and employment of the state, even ignoring the potential and unpredictable pandemic threat of overwhelmed hospitals, death toll, and long-term health consequences. Economic and public health effects could ripple out and in with secondary and tertiary effects.

It’s not anything like less populated rural farm states and natural resource states where, no matter what is going on in the rest of the country and world, the local population is more isolated and the local economy usually keeps trucking along. The Iowa economy and housing market, for example, were barely affected by the 2008 Recession. Indeed, for all its failed state leadership in dealing with COVID-19, low inequality and low poverty Iowa was below average on both job losses and excess deaths. So, if Iowa could do better than most states, in spite of horrible leadership by the Trump-aligned Governor Kim Reynolds (even our Republican parents despise her handling of the crisis), maybe governments in other states also don’t necessarily deserve as much of the blame or credit they are given, at least not in terms of the immediate pandemic response, although long-term public health planning and preparation (over years and decades) would still be important.

That is to say, the situation is complicated. Yet we seem to know some of the key complications, however entangled they may be as potentially causal or contributing. It’s a large web of factors, but strong correlations can be discerned, all of it mostly following already known patterns; though of course we are biased in what we notice according to our focus. The data gathered and analyzed this past year, as far as we can tell, is not fundamentally different in nature from any other data gathered and analyzed over the past century. So, even though COVID-19 is a highly unusual event, what is seen in the data isn’t likely to be surprising, even if it requires multiple layers and angles of interpretation. Still, unexpected results would be welcome in possibly indicating something new and interesting. Serious study of this pandemic has barely begun. The data will keep rolling in. Then decades of debate and theorizing will follow. Some of the observations offered here might to varying degrees stand the test of time, such as the well-established inequality links, but much of it may prove false, dubious, misleading, or partial. Many questions remain unanswered and, in some cases, unasked.

The Pandemic Now And Going Into The Future

“I think people haven’t understood that this isn’t about the next couple of weeks. This is about the next two years.”
~Michael Osterholm, infectious-disease epidemiologist at the University of Minnesota

“Everyone wants to know when this will end. That’s not the right question. The right question is: How do we continue?”
~Devi Sridhar, public-health expert at the University of Edinburgh

A week ago, the highest daily Covid-19 death count for the US was more than 2,000. Now it has reached over 4,500 in the past day. That is an expected exponential increase. And that is with strong measures like lockdowns in place across the country. When doing a recount by adding in all deaths now known, China increased its Wuhan death count by 50%. That is probably true in many places where hospitals were overwhelmed and many died without medical care.

This isn’t to imply China was necessarily being deceptive in covering up the real numbers. For a while now, medical staff in the US have said the same thing about hospitals here underreporting Covid-19 deaths. Healthcare worker deaths may also be higher. Another article shared the photographs and stories of some of these people who died while helping others. We noticed that all of them looked overweight, indicating metabolic syndrome, which is one of the main comorbidities.

By the way, one expert talks about five stages for the pandemic. We are in the second phase which is mitigation following the initial containment. After that will be another period of containment while we wait for a vaccine, other treatments, and improved lab testing. That could take us into next year, but the economy will begin to restart during this time.

As communities begin to open up again, the government will have to become very strict, systematic, and targeted in quarantining the infected. Cleaning and disinfection of public places will become a priority, as will the use of protective gear. The fourth stage comes when we have a vaccine, assuming we get one in the relatively near future. The hope is to be in a more advanced situation of containment before a second wave of infections might hit in the fall.

With everything reasonably under control, we end with the last stage where we assess the situation, determine successes and failures, and then prepare for the next pandemic. That means making pandemic preparation central to national security.

This situation, of course, has long term consequences. Donald Trump being president exacerbates this. Even before the pandemic, his actions as leader were driving a wedge between the US and its allies. Many foreign governments were seeing the US as no longer trustworthy and reliable. Trump’s attacking and defunding the WHO, if somewhat deserved, has further undermined US authority — specifically among the G7. The US might never recover its position in the world. This might be the end of US hegemony.

Now, most likely, Trump will be re-elected. So, four more years of the same, precisely at the moment when confidence has been shaken in national leadership and the federal government. The main promise Trump made was that he would make the American economy great again, but now it will be in shambles. All his scapegoating will only go so far. While Americans suffer, people will want action and reform, not snarky blame games for political gain.

For years and maybe decades to come, we might not only be recovering from the pandemic and all that is related to it but a more general sense of decline and malaise, if not further catastrophes that become existential crises. If we are to enter a re-building phase, it’s going to require entirely new leadership in both of the main parties. We can hope for an era of large-scale reform that will transform our society, but it’s hard to see hope at the moment.

* * *

Some articles of interest:

Some Thoughts On Thinking Critically In Times Of Uncertainty, And The Trap of Lopsided Skepticism: Coronaspiracy Theory Edition
by Denise Minger

In case you didn’t notice, the cyber-world (and its 3D counterpart, I assume, but we’re not allowed to venture there anymore) is currently a hot mess of Who and what do we believe? This is zero percent surprising. Official agencies have handled COVID-19 with all the grace of a three-legged elephant—waffling between the virus being under control/not under control/OMG millions dead/wait no 60,000/let’s pack the churches on Easter!/naw, lockdown-til-August/face masks do nothing/face masks do something, but healthcare workers need them more/FACE MASKS FOR EVERY FACE RIGHT NOW PLEASE AND THANK YOU/oh no a tiger got the ‘rona!; on and on. It’s dizzying. Maddening. The opposite of confidence-instilling. And as a very predictable result, guerrilla journalism has grown to fill the void left by those who’ve failed to tell us, with any believability, what’s going on.

Exercising our investigative rights is usually a good thing. You guys know me. I’m all about questioning established narratives and digging into the forces that crafted them. It’s literally my life. Good things happen when we flex our thinking muscle, and nothing we’re told should be immune to scrutiny.

But there’s a shadow side here, too—what I’ll henceforth refer to as “lopsided skepticism.” This is what happens when we question established narratives… but not the non-established ones. More specifically, when we go so hog wild ripping apart The Official Story that we somehow have no skepticism left over for all the new stuff we’re replacing it with.

And that, my friends, is exactly what’s happening right now.

The dangerous conservative campaign against expertise
by Michael Gerson

Motivated reasoning is usually just tiresome. At its worst, it can be dangerous. Sometimes drawing the wrong lesson badly obscures a right and necessary lesson. Sometimes the interpretation of a crisis is so dramatically mistaken, so ludicrous and imprudent, that it can worsen the crisis itself.

Such is the case with conservatives who look at the coronavirus outbreak and see, of all things, the discrediting of experts and expertise. In this view, the failures of the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) have brought the whole profession into disrepute. The judgments of health professionals have often been no better than the folk wisdom of the Internet. The pandemic is not only further proof of the fallibility of insiders; it has revealed the inherent inaccessibility of medical truth. All of us, scientists and nonscientists, are walking blindly on the same misty moor and may stumble on medical insights.

This argument assumes an intellectual fog that is just lifting. Though we are still relatively early in the pandemic, this much seems clear: The medical experts recommended aggressive social distancing to bend the curve of infections and deaths downward. Americans generally trusted the experts. By all the evidence, aggressive social distancing is bending the curve of infections and deaths downward. And places that were earliest and most aggressive in this approach have seen the best results.

This outcome doesn’t strike me as murky. It is difficult to see how experts whose advice clearly saved tens of thousands of lives can be called discredited. It is easy, however, to see how making this false claim might undermine public adherence to their advice, which still matters greatly in the crisis.

Our Pandemic Summer
by Ed Yong

If it turns out that, say, 20 percent of the U.S. has been infected, that would mean the coronavirus is more transmissible but less deadly than scientists think. It would also mean that a reasonable proportion of the country has some immunity. If that proportion could be slowly and safely raised to the level necessary for herd immunity—60 to 80 percent, depending on the virus’s transmissibility—the U.S. might not need to wait for a vaccine. However, if just 1 to 5 percent of the population has been infected—the range that many researchers think is likelier—that would mean “this is a truly devastating virus, and we have built up no real population immunity,” said Michael Mina, an epidemiologist and immunologist at Harvard. “Then we’re in dire straits in terms of how to move forward.”

Even in the optimistic scenario, a quick and complete return to normalcy would be ill-advised. And even in the pessimistic scenario, controlling future outbreaks should still be possible, but only through an immense public-health effort. Epidemiologists would need to run diagnostic tests on anyone with COVID-19–like symptoms, quarantine infected people, trace everyone those people had contact with in the previous week or so, and either quarantine those contacts or test them too. These are the standard pillars of public health, but they’re complicated by the coronavirus’s ability to spread for days before causing symptoms. Every infected person has a lot of potential contacts, and may have unknowingly infected many of them.

The Pandemic Will Cleave America in Two
by Joe Pinsker

When someone dies, there are three ways to think about what caused it, according to Scott Frank, a professor at Case Western Reserve University’s School of Medicine. The first is the straightforward, “medical” cause of death—diagnosable things like heart disease or cancer. The second is the “actual” cause of death—that is, the habits and behaviors that over time contributed to the medical cause of death, such as smoking cigarettes or being physically inactive. The third is what Frank refers to as the “actual actual” cause of death—the bigger, society-wide forces that shaped those habits and behaviors.

In one analysis of deaths in the U.S. resulting from “social factors” (Frank’s “actual actual” causes), the top culprits were poverty, low levels of education, and racial segregation. “Each of these has been demonstrated to have independent effects on chronic-disease mortality and morbidity,” Frank said. (Morbidity refers to whether someone has a certain disease.) He expects that the same patterns will hold for COVID-19.

To begin with, the physical effects of COVID-19 are far worse for some people than others. There are two traits that seem to matter most. The first is age. Older people are at greater risk of experiencing the more devastating version of the pandemic, in part because the immune system weakens with age. Early data from the Centers for Disease Control and Prevention indicate that, in the U.S., the risk of dying from the disease begins to climb at around age 55, and is especially acute for those 85 and older. “I think the pattern we’re going to see clearly is an age-related pattern” of mortality, Andrew Noymer, a public-health professor at UC Irvine, said. (Younger people aren’t invulnerable to the disease, though; the CDC found in mid-March that 20-to-54-year-olds had accounted for almost 40 percent of hospitalizations known to have been caused by the disease.)

The second trait that puts someone at increased risk is having a serious health condition such as diabetes, heart disease, or lung disease. These conditions seem to make cases of COVID-19 more likely to be severe or fatal, and the risks rise considerably for older adults who have any of these conditions, Frank told me.

But while everyone ages, rich and poor alike, these health conditions are not evenly distributed throughout the population. They’re more common among people with less education, less money, and less access to health care. “We know these social and economic conditions have a profound effect on chronic disease,” Frank said, “and then chronic disease has a profound effect on the mortality related to COVID.”

“For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.”

Homo Deus: A Brief History of Tomorrow
by Yuval Noah Harari

  • “Poverty certainly causes many other health problems, and malnutrition shortens life expectancy even in the richest countries on earth. In France, for example, 6 million people (about 10 percent of the population) suffer from nutritional insecurity. They wake up in the morning not knowing whether they will have anything to eat for lunch: they often go to sleep hungry; and the nutrition they do obtain is unbalanced and unhealthy — lots of starches, sugar and salt, and not enough protein and vitamins. Yet nutritional insecurity isn’t famine, and France of the early twenty-first century isn’t France of 1694. Even in the worst slums around Beauvais or Paris, people don’t die because they have not eaten for weeks on end.”
  • “Indeed, in most countries today overeating has become a far worse problem than famine. In the eighteenth century Marie Antoinette allegedly advised the starving masses that if they ran out of bread, they should just eat cake instead. Today, the poor are following this advice to the letter. Whereas the rich residents of Beverly Hills eat lettuce salad and steamed tofu with quinoa, in the slums and ghettos the poor gorge on Twinkie cakes, Cheetos, hamburgers and pizza. In 2014 more than 2.1 billion people were overweight compared to 850 million who suffered from malnutrition. Half of humankind is expected to be overweight by 2030. In 2010 famine and malnutrition combined killed about 1 million people, whereas obesity killed 3 million.”
  • “During the second half of the twentieth century this Law of the Jungle has finally been broken, if not rescinded. In most areas wars became rarer than ever. Whereas in ancient agricultural societies human violence caused about 15 per cent of all deaths, during the twentieth century violence caused only 5 per cent of deaths, and in the early twenty-first century it is responsible for about 1 per cent of global mortality. In 2012, 620,000 people died in the world due to human violence (war killed 120,000 people, and crime killed another 500,000). In contrast, 800,000 committed suicide, and 1.5 million died of diabetes. Sugar is now more dangerous than gunpowder.”
  • “What about terrorism, then? Even if central governments and powerful states have learned restraint, terrorists might have no such qualms about using new and destructive weapons. That is certainly a worrying possibility. However, terrorism is a strategy of weakness adopted by those who lack access to real power. At least in the past, terrorism worked by spreading fear rather than by causing significant material damage. Terrorists usually don’t have the strength to defeat an army, occupy a country or destroy entire cities. In 2010 obesity and related illnesses killed about 3 million people, whereas terrorists killed a total of 7,697 people across the globe, most of them in developing countries. For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.”

Harari’s basic argument is compelling. The kinds of violence and death we experience now are far different. The whole reason I wrote this post is because of a few key points that stood out to me: “Sugar is now more dangerous than gunpowder.” And: “For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.” As those quotes make clear, our first world problems are of a different magnitude. But I would push back against his argument, at least as it applies to much of the rest of the world: he makes the same mistake as Steven Pinker in ignoring slow violence (so pervasive and systemic as to go unnoticed and uncounted, unacknowledged and unreported, often intentionally hidden). Parts of the United States also are in third world conditions. So, it isn’t simply a problem of nutritional excess from a wealthy economy. That wealth isn’t spread evenly, much less the nutrient-dense healthy foods or the healthcare. Likewise, the violence and oppression fall harder upon some than others. Those like Harari and Pinker can go through their entire lives seeing very little of it.

Since World War Two, there have been thousands of acts of mass violence: wars and proxy wars, invasions and occupations, bombings and drone strikes; covert operations toppling governments and backing paramilitaries and terrorists; civil wars, revolutions, famines, droughts, refugee crises, and genocides; et cetera. Most of these events of mass violence were directly or indirectly caused by the global superpowers, not only through military aggression but also through destabilizing regions, exploiting third world countries, stealing wealth and resources, enforcing sanctions on food and medicine, manipulating economies, entrapping nations in debt, artificially creating poverty, and being the main contributors to environmental destruction and climate change. One way or another, these institutionalized and globalized forms of injustice and oppression might be the combined largest cause of death, possibly a larger number than in any society seen before. Yet they are rationalized away as ‘natural’ deaths, just people dying.

Over the past three-quarters of a century, probably billions of people in the world have been killed, maimed, imprisoned, tortured, starved, orphaned, and had their lives cut short. Some of this was blatant violent action and the rest was slow violence. But it was all intentional, part of the wealthy and powerful seeking to maintain their wealth and power and gain even more. There is little justification for all this violence. Even the War on Terror involved cynical plans for attacking countries like Iraq that had preceded the terrorist attacks themselves. The Bush cronies, long before the 2000 presidential election, had it written down on paper that they were looking for an excuse to take Saddam Hussein out of power. The wars in Afghanistan and Iraq killed millions of people, around 5% or so of the population (the equivalent would be if a foreign power killed a bit less than 20 million Americans). The depleted uranium weapons spread across the landscape will add millions more deaths over the decades — slow, torturous, and horrific deaths, many of them of children. Multiply that by the hundreds of other similar US actions, and then multiply that by the number of other countries that have committed similar crimes against humanity.

Have we really become less violent? Or has violence simply taken new forms? Maybe we should wait until after the coming World War Three before declaring a new era of peace, love, and understanding. Numerous other historical periods had a few generations without war and such. That is not all that impressive. The last two world wars are still in living memory and hence living trauma. Let’s give it some time before we start singing the praises and glory of our wonderful advancement as a civilization guided by our techno-utopian fantasies of Whiggish liberalism. But let’s also not so easily dismiss the tremendous suffering and costs from the diseases of civilization that worsen with each generation; not only obesity, diabetes, heart disease but also autoimmune conditions, Alzheimer’s, schizophrenia, mood disorders, ADHD, autism, and on and on — besides diet and nutrition, much of it caused by chemical exposure from factory pollution, oil spills, ocean dumping, industrial farming, food additives, packaging, and environmental toxins. And we must not forget the role that governments have played in pushing harmful dietary recommendations of low-fat and high-carb that, in being spread worldwide by the wealth and power and influence of the United States, has surely harmed at least hundreds of millions over the past several generations.

The fact that sugar is more dangerous than gunpowder, Coca-Cola more dangerous than al-Qaeda… This is not a reason to stop worrying about mass violence and direct violence. Even if declining as a percentage, the total number of violent deaths is still going up, just as there are more slaves now than at the height of slavery prior to the American Civil War. Talking about percentages of certain deaths while excluding other deaths is sleight-of-hand rhetoric. That misses an even bigger point. The corporate plutocracy that now rules our neo-fascist society of inverted totalitarianism poses the greatest threat of our age. That is not an exaggeration. It is simply what the data shows us to be true, as Harari unintentionally reveals. Privatized profit comes at a public price, a price we can’t afford. Even ignoring the greater externalized costs of environmental harm from corporations (and the general degradation of society from worsening inequality), the increasing costs of healthcare because of diseases caused by highly profitable and highly processed foods that are scientifically designed to be palatable and addictive (along with the systematic dismantling of traditional food systems) could bankrupt many countries in the near future and cripple their populations in the process. World War Three might turn out to be the least of our worries. Just because most of the costs have been externalized on the poor and delayed to future generations doesn’t mean they aren’t real. It will take a while to get the full death count.


State and Non-State Violence Compared

There is a certain kind of academic that simultaneously interests me and infuriates me. Jared Diamond, in The World Until Yesterday, is an example of this. He is a knowledgeable guy and is able to communicate that knowledge in a coherent way. He makes many worthy observations and can be insightful. But there is also a naivete that at times shows up in his writing. I get the sense that occasionally his conclusions preceded the evidence he shares. Also, he’ll point out the problems with the evidence and then, ignoring what he has admitted, will treat that evidence as strongly supporting his biased preconceptions.

Despite my enjoyment of Diamond’s book, I was disappointed specifically in his discussion of violence and war (much of the rest of the book, though, is worthy and I recommend it). Among the intellectual elite, it seems fashionable right now to describe modern civilization as peaceful — that is fashionable among the main beneficiaries of modern civilization, not so much fashionable according to those who bear the brunt of the costs.

In Chapter 4, he asks, “Did traditional warfare increase, decrease, or remain unchanged upon European contact?” That is a good question. And as he makes clear, “This is not a straightforward question to decide, because if one believes that contact does affect the intensity of traditional warfare, then one will automatically distrust any account of it by an outside observer as having been influenced by the observer and not representing the pristine condition.” But he never answers the question. He simply assumes that the evidence proves what he appears to have already believed.

I’m not saying he doesn’t take significant effort to make a case. He goes on to say, “However, the mass of archaeological evidence and oral accounts of war before European contact discussed above makes it far-fetched to maintain that people were traditionally peaceful until those evil Europeans arrived and messed things up.” The archaeological and oral evidence, like the anthropological evidence, is diverse. For example, in northern Europe, there is no evidence of large-scale warfare before the end of the Bronze Age when multiple collapsing civilizations created waves of refugees and marauders.

All the evidence shows us is that some non-state societies have been violent and others non-violent, no different than in comparing state societies. But we must admit, as Diamond does briefly, that contact and the rippling influences of contact across wide regions can lead to greater violence, along with other alterations in the patterns of traditional culture and lifestyle. Before contact ever happens, most non-state societies have already been influenced by trade, disease, environmental destruction, invasive species, refugees, etc. These pre-contact indirect influences can last for generations or centuries prior to final contact, especially with non-state societies that were more secluded. And those secluded populations are the most likely to be studied as supposedly representative of uncontacted conditions.

We should be honest in admitting our vast ignorance. The problem is that, if Diamond fully admitted this, he would have little to write about on such topics or it would be a boring book with all of the endless qualifications (I personally like scholarly books filled with qualifications, but most people don’t). He is in the business of popular science and so speculation is the name of the game he is playing. Some of his speculations might not hold up to much scrutiny, not that the average reader will offer much scrutiny.

He continues to claim that, “the evidence of traditional warfare, whether based on direct observation or oral histories or archaeological evidence, is so overwhelming.” And so asks, “why is there still any debate about its importance?” What a silly question. We simply don’t know. He could be right, just as easily as he could be wrong. Speculations are a dime a dozen. The same evidence can be and regularly is made to conform to and confirm endless hypotheses that are mostly non-falsifiable. We don’t know and probably will never know. It’s like trying to use chimpanzees as a comparison for human nature, even though chimpanzees have for a long time been in a conflict zone with human encroachment, poaching, civil war, habitat loss, and ecosystem destabilization. No one knows what chimpanzees were like pre-contact. But we do know that bonobos that live across a major river in a less violent area express less violent behavior. Maybe there is a connection, not that Diamond is as likely to mention these kinds of details.

I do give him credit, though. He knows he is on shaky ground. In pointing out the problems he previously discussed, he writes that, “One reason is the real difficulties, which we have discussed, in evaluating traditional warfare under pre-contact or early-contact conditions. Warriors quickly discern that visiting anthropologists disapprove of war, and the warriors tend not to take anthropologists along on raids or allow them to photograph battles undisturbed: the filming opportunities available to the Harvard Peabody Expedition among the Dani were unique. Another reason is that the short-term effects of European contact on tribal war can work in either direction and have to be evaluated case by case with an open mind.” In between the lines, Jared Diamond makes clear that he can’t really know much of anything about earlier non-state warfare.

Even as he mentions some archaeological sites showing evidence of mass violence, he doesn’t clarify that these sites are a small percentage of archaeological sites, most of which don’t show mass violence. It’s not as if anyone is arguing mass violence never happened prior to civilization. The Noble Savage myth is not widely supported these days and so there is no point in his propping it up as a straw man to knock down.

From my perspective, it goes back to what comparisons one wishes to make. Non-state societies may or may not be more violent per capita. But that doesn’t change the reality that state societies cause more harm, as a total number. Consider one specific example of state warfare. The United States has been continuously at war since it was founded, which is to say not a year has gone by without war (against both state and non-state societies), and most of that has been wars of aggression. The US military, CIA covert operations, economic sanctions, etc. surely have killed at least hundreds of millions of people in my lifetime — probably more people killed than by all non-states combined throughout human existence.

Here is the real difference in violence between non-states and states. State violence is more hierarchically controlled and targeted in its destruction. Non-state societies, on the other hand, tend to spread the violence across entire populations. When a tribe goes to war, often the whole tribe is involved. So state societies are different in that usually only the poor and minorities, the oppressed and disadvantaged experience the harm. If you look at the specifically harmed populations in state societies, the mortality rate is probably higher than seen in non-state societies. The essential point is that this violence is concentrated and hidden.

Immensely larger numbers of people are the victims of modern state violence, overt violence and slow violence. But the academics who write about it never have to personally experience or directly observe these conditions of horror, suffering, and despair. Modern civilization is less violent for the liberal class, of which academics are members. That doesn’t say much about the rest of the global population. The permanent underclass lives in constant violence within their communities and from state governments, which leads to a different view on the matter.

To emphasize this bias, one could further note what Jared Diamond ignores or partly reports. In the section where he discusses violence, he briefly mentions the Piraha. He could have pointed out that they are a non-violent non-state society. They have no known history of warfare, capital punishment, abuse, homicide, or suicide — at least none has been observed or discovered through interviews. Does he write about this evidence that contradicts his views? Of course not. Instead, lacking any evidence of violence, he speculates about violence. Here is the passage from Chapter 2 (pp. 93-94):

“Among still another small group, Brazil’s Piraha Indians (Plate 11), social pressure to behave by the society’s norms and to settle disputes is applied by graded ostracism. That begins with excluding someone from food-sharing for a day, then for several days, then making the person live some distance away in the forest, deprived of normal trade and social exchanges. The most severe Piraha sanction is complete ostracism. For instance, a Piraha teen-ager named Tukaaga killed an Apurina Indian named Joaquim living nearby, and thereby exposed the Piraha to the risk of a retaliatory attack. Tukaaga was then forced to live apart from all other Piraha villages, and within a month he died under mysterious circumstances, supposedly of catching a cold, but possibly instead murdered by other Piraha who felt endangered by Tukaaga’s deed.”

Why did he add that unfounded speculation at the end? The only evidence he has is that their methods of social conformity are non-violent. Someone is simply ostracized. But that doesn’t fit his beliefs. So he assumes there must be some hidden violence that has never been discovered after generations of observers having lived among them. Even the earliest account of contact from centuries ago, as far as I know, indicates absolutely no evidence of violence. It makes one wonder how many more examples he ignores, dismisses, or twists to fit his preconceptions.

This reminds me of Julian Jaynes’ theory of bicameral societies. He noted that these Bronze Age societies were non-authoritarian, despite having high levels of social conformity. There is no evidence of these societies having written laws, courts, police forces, formal systems of punishment, and standing armies. Like non-state tribal societies, when they went to war, the whole population sometimes was mobilized. Bicameral societies were smaller, mostly city-states, and so still had elements of tribalism. But the point is that the enculturation process itself was powerful enough to enforce order without violence. That was only within a society, as war still happened between societies, although it was limited and usually only involved neighboring societies. I don’t think there is evidence of continual warfare. Yet when conflict erupted, it could lead to total war.

It’s hard to compare either tribes or ancient city-states to modern nation-states. Their social orders and how they maintained them are far different. And the violence involved is of a vastly disparate scale. Besides, I wouldn’t take the past half century of relative peace in the Western world as being representative of modern civilization. In this new century, we might see billions of deaths from all combined forms of violence. And the centuries earlier were some of the bloodiest and most destructive ever recorded. Imperialism and colonialism, along with the legacy systems of neo-imperialism and neo-colonialism, have caused and contributed to the genocide or cultural destruction of probably hundreds of thousands of societies worldwide, in most cases with all evidence of their existence having disappeared. This wholesale massacre has led to a dearth of societies left remaining with which to make comparisons. The survivors living in isolated niches may not be representative of the societal diversity that once existed.

Anyway, the variance of violence and war casualty rates likely is greater in comparing societies of the same kind than in comparing societies of different kinds. As the nearby bonobos are more peaceful than chimpanzees, the Piraha are more peaceful than the Yanomami who live in the same region — as Canada is more peaceful than the US. That might be important to explain and a lot more interesting. But this more incisive analysis wouldn’t fit Western propaganda, specifically the neo-imperial narrative of Pax Americana. From Pax Hispanica to Pax Britannica to Pax Americana, quite possibly billions of combatants have died in wars and billions more innocents as casualties. That is neither a small percentage nor a small total number, if anyone is genuinely concerned about body counts.

* * *

Rebutting Jared Diamond’s Savage Portrait
by Paul Sillitoe & Mako John Kuwimb, iMediaEthics

Why Does Jared Diamond Make Anthropologists So Mad?
by Barbara J. King, NPR

In a beautifully written piece for The Guardian, Wade Davis says that Diamond’s “shallowness” is what “drives anthropologists to distraction.” For Davis, geographer Diamond doesn’t grasp that “cultures reside in the realm of ideas, and are not simply or exclusively the consequences of climatic and environmental imperatives.”

Rex Golub at Savage Minds slams the book for “a profound lack of thought about what it would mean to study human diversity and how to make sense of cultural phenomena.” In a fit of vexed humor, the Wenner-Gren Foundation for anthropological research tweeted Golub’s post along with this comment: “@savageminds once again does the yeoman’s work of exploring Jared Diamond’s new book so the rest of us don’t have to.”

This biting response isn’t new; see Jason Antrosio’s post from last year in which he calls Diamond’s Pulitzer Prize-winning Guns, Germs, and Steel a “one-note riff,” even “academic porn” that should not be taught in introductory anthropology courses.

Now, in no way do I want to be the anthropologist who defends Diamond because she just doesn’t “get” what worries all the cool-kid anthropologists about his work. I’ve learned from their concerns; I’m not dismissing them.

In point of fact, I was startled at this passage on the jacket of The World Until Yesterday: “While the gulf that divides us from our primitive ancestors may seem unbridgably wide, we can glimpse most of our former lifestyle in those largely traditional societies that still exist or were recently in existence.” This statement turns small-scale societies into living fossils, the human equivalent of ancient insects hardened in amber. That’s nonsense, of course.

Lest we think to blame a publicist (rather than the author) for that lapse, consider the text itself. Near the start, Diamond offers a chronology: until about 11,000 years ago, all people lived off the land, without farming or domesticated animals. Only around 5,400 years ago did the first state emerge, with its dense population, labor specialization and power hierarchy. Then Diamond fatally overlays that past onto the present: “Traditional societies retain features of how all of our ancestors lived for tens of thousands of years, until virtually yesterday.” Ugh.

Another problem, one I haven’t seen mentioned elsewhere, bothers me just as much. When Diamond urges his WEIRD readers to learn from the lifeways of people in small-scale societies, he concludes: “We ourselves are the only ones who created our new lifestyles, so it’s completely in our power to change them.” Can he really be so unaware of the privilege that allows him to assert — or think — such a thing? Too many people living lives of poverty within industrialized nations do not have it “completely in their power” to change their lives, to say the least.

Patterns of Culture by Ruth Benedict (1934) wins Jared Diamond (2012)
by Jason Antrosio, Living Anthropologically

Compare to Jared Diamond. Diamond has of course acquired some fame for arguing against biological determinism, and his Race Without Color was once a staple for challenging simplistic tales of biological race. But by the 1990s, Diamond simply echoes received liberal wisdom. Benedict and Weltfish’s Races of Mankind was banned by the Army as Communist propaganda, and Weltfish faced persecution from McCarthyism (Micaela di Leonardo, Exotics at Home 1998:196,224; see also this Jon Marks comment on Gene Weltfish). Boas and Benedict swam against the current of the time, when backlash could be brutal. In contrast, Diamond’s claims on race and IQ have mostly been anecdotal. They have never been taken seriously by those who call themselves “race realists” (see Jared Diamond won’t beat Mitt Romney). Diamond has never responded scientifically to the re-assertion of race from sources like “A Family Tree in Every Gene,” and he helped propagate a medical myth about racial differences in hypertension.

And, of course, although Guns, Germs, and Steel has been falsely branded as environmental or geographical determinism, there is no doubt that Diamond leans heavily on agriculture and geography as explanatory causes for differential success. […]

Compare again Jared Diamond. Diamond has accused anthropologists of falsely romanticizing others, but by subtitling his book What Can We Learn from Traditional Societies, Diamond engages in more than just politically correct euphemism. When most people think of a “traditional society,” they are thinking of agrarian peasant societies or artisan handicrafts. Diamond, however, is referring mainly to what we might term tribal societies, or hunters and gatherers with some horticulture. Curiously, for Diamond the dividing line between the yesterday of traditional and the today of the presumably modern was somewhere around 5,000-6,000 years ago (see The Colbert Report). As John McCreery points out:

Why, I must ask, is the category “traditional societies” limited to groups like Inuit, Amazonian Indians, San people and Melanesians, when the brute fact of the matter is that the vast majority of people who have lived in “traditional” societies have been peasants living in traditional agricultural civilizations over the past several thousand years since the first cities appeared in places like the valleys of the Nile, the Tigris-Euphrates, the Ganges, the Yellow River, etc.? Talk about a big blind spot.

Benedict draws on the work of others, like Reo Fortune in Dobu and Franz Boas with the Kwakiutl. Her own ethnographic experience was limited. But unlike Diamond, Benedict was working through the best ethnographic work available. Diamond, in contrast, splays us with a story from Allan Holmberg, which then gets into the New York Times, courtesy of David Brooks. Compare bestselling author Charles Mann on “Holmberg’s Mistake” (the first chapter of his 1491: New Revelations of the Americas Before Columbus):

The wandering people Holmberg traveled with in the forest had been hiding from their abusers. At some risk to himself, Holmberg tried to help them, but he never fully grasped that the people he saw as remnants from the Paleolithic Age were actually the persecuted survivors of a recently shattered culture. It was as if he had come across refugees from a Nazi concentration camp, and concluded that they belonged to a culture that had always been barefoot and starving. (Mann 2005:10)

As for Diamond’s approach to comparing different groups: “Despite claims that Diamond’s book demonstrates incredible erudition what we see in this prologue is a profound lack of thought about what it would mean to study human diversity and how to make sense of cultural phenomenon” (Alex Golub, How can we explain human variation?).

Finally there is the must-read review Savaging Primitives: Why Jared Diamond’s ‘The World Until Yesterday’ Is Completely Wrong by Stephen Corry, Director of Survival International:

Diamond adds his voice to a very influential sector of American academia which is, naively or not, striving to bring back out-of-date caricatures of tribal peoples. These erudite and polymath academics claim scientific proof for their damaging theories and political views (as did respected eugenicists once). In my own, humbler, opinion, and experience, this is both completely wrong – both factually and morally – and extremely dangerous. The principal cause of the destruction of tribal peoples is the imposition of nation states. This does not save them; it kills them.

[…] Indeed, Jared Diamond has been praised for his writing, for making science popular and palatable. Others have been less convinced. As David Brooks reviews:

Diamond’s knowledge and insights are still awesome, but alas, that vividness rarely comes across on the page. . . . Diamond’s writing is curiously impersonal. We rarely get to hear the people in traditional societies speak for themselves. We don’t get to meet any in depth. We don’t get to know what their stories are, what the contents of their religions are, how they conceive of individual selfhood or what they think of us. In this book, geographic and environmental features play a much more important role in shaping life than anything an individual person thinks or feels. The people Diamond describes seem immersed in the collective. We generally don’t see them exercising much individual agency. (Tribal Lessons; of course, Brooks may be smarting from reviews that called his book The Dumbest Story Ever Told)

[…] In many ways, Ruth Benedict does exactly what Wade Davis wanted Jared Diamond to do – rather than providing a how-to manual of “tips we can learn,” to really investigate the existence of other possibilities:

The voices of traditional societies ultimately matter because they can still remind us that there are indeed alternatives, other ways of orienting human beings in social, spiritual and ecological space. This is not to suggest naively that we abandon everything and attempt to mimic the ways of non-industrial societies, or that any culture be asked to forfeit its right to benefit from the genius of technology. It is rather to draw inspiration and comfort from the fact that the path we have taken is not the only one available, that our destiny therefore is not indelibly written in a set of choices that demonstrably and scientifically have proven not to be wise. By their very existence the diverse cultures of the world bear witness to the folly of those who say that we cannot change, as we all know we must, the fundamental manner in which we inhabit this planet. (Wade Davis review of Jared Diamond; and perhaps one of the best contemporary versions of this project is Wade Davis, The Wayfinders: Why Ancient Wisdom Matters in the Modern World)

[…] This history reveals the major theme missing from both Benedict’s Patterns of Culture and especially missing from Diamond – an anthropology of interconnection. That as Eric Wolf described in Europe and the People Without History peoples once called primitive – now perhaps more politely termed tribal or traditional – were part of a co-production with Western colonialism. This connection and co-production had already been in process long before anthropologists arrived on the scene. Put differently, could the Dobuan reputation for being infernally nasty savages have anything to do with the white recruiters of indentured labour, which Benedict mentions (1934:130) but then ignores? Could the revving up of the Kwakiutl potlatch and megalomaniac gamuts have anything to do with the fur trade?

The Collapse Of Jared Diamond
by Louis Proyect, Swans Commentary

In general, the approach of the authors is to put the ostensible collapse into historical context, something that is utterly lacking in Diamond’s treatment. One of the more impressive record-correcting exercises is Terry L. Hunt and Carl P. Lipo’s Ecological Catastrophe, Collapse, and the Myth of “Ecocide” on Rapa Nui (Easter Island). In Collapse, Diamond judged Easter Island as one of the more egregious examples of “ecocide” in human history, a product of the folly of the island’s rulers whose decision to construct huge statues led to deforestation and collapse. By chopping down the huge palm trees used to transport the stones for statue construction, the islanders were effectively sealing their doom. Not only did the settlers chop down trees, they hunted the native fauna to extinction. The net result was a loss of habitat that led to a steep population decline.

Diamond was not the first observer to call attention to deforestation on Easter Island. In 1786, a French explorer named La Pérouse also blamed the “imprudence of their ancestors for their present unfortunate situation.”

Referring to research about Easter Island by scientists equipped with the latest technologies, the authors maintain that the deforestation had nothing to do with transporting statues. Instead, it was an accident of nature related to the arrival of rats in the canoes of the earliest settlers. Given the lack of native predators, the rats had a field day and consumed the palm nuts until the trees were no longer reproducing themselves at a sustainable rate. The settlers also chopped down trees to make a space for agriculture, but the idea that giant statues had anything to do with the island’s collapse is as much of a fiction as Diamond’s New Yorker article.

Unfortunately, Diamond is much more interested in ecocide than genocide. If people interested him half as much as palm trees, he might have said a word or two about the precipitous decline in population that occurred after the island was discovered by Europeans in 1722. Indeed, despite deforestation there is evidence that the island’s population grew between 1250 and 1650, the period when deforestation was taking place — leaving aside the question of its cause. As was the case when Europeans arrived in the New World, a native population was unable to resist diseases such as smallpox and died in massive numbers. Of course, Diamond would approach such a disaster with his customary Olympian detachment and write it off as an accident of history.

While all the articles pretty much follow the same narrowly circumscribed path as the one on Easter Island, there is one that adopts the Grand Narrative that Jared Diamond has made a specialty of and beats him at his own game. I am referring to the final article, Sustainable Survival by J.R. McNeill, who describes himself in a footnote thusly: “Unlike most historians, I have no real geographic specialization and prefer — like Jared Diamond — to hunt for large patterns in the human past.”

And one of those “large patterns” ignored by Diamond is colonialism. The greatest flaw in Collapse is that it does not bother to look at the impact of one country on another. By treating countries in isolation from one another, it becomes much easier to turn the “losers” into examples of individual failing. So when Haiti is victimized throughout the 19th century for having the temerity to break with slavery, this hardly enters into Diamond’s moral calculus.

Compassion Sets Humans Apart
by Penny Spikins, Sapiens

There are, perhaps surprisingly, only two known cases of likely interpersonal violence in the archaic species most closely related to us, Neanderthals. That’s out of a total of about 30 near-complete skeletons and 300 partial Neanderthal finds. One—a young adult living in what is now St. Césaire, France, some 36,000 years ago—had the front of his or her skull bashed in. The other, a Neanderthal found in Shanidar Cave in present-day Iraq, was stabbed in the ribs between 45,000 and 35,000 years ago, perhaps by a projectile point shot by a modern human.

The earliest possible evidence of what might be considered warfare or feuding doesn’t show up until some 13,000 years ago at a cemetery in the Nile Valley called Jebel Sahaba, where many of the roughly 60 Homo sapiens individuals appear to have died a violent death.

Evidence of human care, on the other hand, goes back at least 1.5 million years—to long before humans were anatomically modern. A Homo ergaster female from Koobi Fora in Kenya, dated to about 1.6 million years ago, survived several weeks despite a toxic overaccumulation of vitamin A. She must have been given food and water, and protected from predators, to live long enough for this disease to leave a record in her bones.

Such evidence becomes even more notable by half a million years ago. At Sima de los Huesos (Pit of Bones), a site in Spain occupied by ancestors of Neanderthals, three of 28 individuals found in one pit had severe pathology—a girl with a deformed head, a man who was deaf, and an elderly man with a damaged pelvis—but they all lived for long periods of time despite their conditions, indicating that they were cared for. At the same site in Shanidar where a Neanderthal was found stabbed, researchers discovered another skeleton who was blind in one eye and had a withered arm and leg as well as hearing loss, which would have made it extremely hard or impossible to forage for food and survive. His bones show he survived for 15 to 20 years after injury.

At a site in modern-day Vietnam called Man Bac, which dates to around 3,500 years ago, a man with almost complete paralysis and frail bones was looked after by others for over a decade; he must have received care that would be difficult to provide even today.

All of these acts of caring lasted for weeks, months, or years, as opposed to a single moment of violence.

Violence, Okinawa, and the ‘Pax Americana’
by John W. Dower, The Asia-Pacific Journal

In American academic circles, several influential recent books argue that violence declined significantly during the Cold War, and even more precipitously after the demise of the Soviet Union in 1991. This reinforces what supporters of US strategic policy, including Japan’s conservative leaders, have always claimed. Since World War II, they contend, the militarized Pax Americana, including nuclear deterrence, has ensured the decline of global violence.

I see the unfolding of the postwar decades through a darker lens.

No one can say with any certainty how many people were killed in World War II. Apart from the United States, catastrophe and chaos prevailed in almost every country caught in the war. Beyond this, even today criteria for identifying and quantifying war-related deaths vary greatly. Thus, World War II mortality estimates range from an implausible low of 50 million military and civilian fatalities worldwide to as many as 80 million. The Soviet Union, followed by China, suffered by far the greatest number of these deaths.

Only when this slaughter is taken as a baseline does it make sense to argue that the decades since World War II have been relatively non-violent.

The misleading euphemism of a “Cold War” extending from 1945 to 1991 helps reinforce the decline-of-violence argument. These decades were “cold” only to the extent that, unlike World War II, no armed conflict took place pitting the major powers directly against one another. Apart from this, these were years of mayhem and terror of every imaginable sort, including genocides, civil wars, tribal and ethnic conflicts, attempts by major powers to suppress anti-colonial wars of liberation, and mass deaths deriving from domestic political policies (as in China and the Soviet Union).

In pro-American propaganda, Washington’s strategic and diplomatic policies during these turbulent years and continuing to the present day have been devoted to preserving peace, defending freedom and the rule of law, promoting democratic values, and ensuring the security of its friends and allies.

What this benign picture ignores is the grievous harm as well as plain folly of much postwar US policy. This extends to engaging in atrocious war conduct, initiating never-ending arms races, supporting illiberal authoritarian regimes, and contributing to instability and humanitarian crises in many parts of the world.

Such destructive behavior was taken to new levels in the wake of the September 11, 2001, attack on the World Trade Center and Pentagon by nineteen Islamist hijackers. America’s heavy-handed military response has contributed immeasurably to the proliferation of global terrorist organizations, the destabilization of the Greater Middle East, and a flood of refugees and internally displaced persons unprecedented since World War II.

Afghanistan and Iraq, invaded following September 11, remain shattered and in turmoil. Neighboring countries are wracked with terror and insurrection. In 2016, the last year of Barack Obama’s presidency, the US military engaged in bombing and air strikes in no less than seven countries (Afghanistan, Iraq, Pakistan, Somalia, Yemen, Libya, and Syria). At the same time, elite US “special forces” conducted largely clandestine operations in an astonishing total of around 140 countries – amounting to almost three-quarters of all the nations in the world.

Overarching all this, like a giant cage, is America’s empire of overseas military bases. The historical core of these bases in Germany, Japan, and South Korea dates back to after World War II and the Korean War (1950-1953), but the cage as a whole spans the globe and is constantly being expanded or contracted. The long-established bases tend to be huge. Newer installations are sometimes small and ephemeral. (The latter are known as “lily pad” facilities, and now exist in around 40 countries.) The total number of US bases presently is around 800.

Okinawa has exemplified important features of this vast militarized domain since its beginnings in 1945. Current plans to relocate US facilities to new sites like Henoko, or to expand to remote islands like Yonaguni, Ishigaki, and Miyako in collaboration with Japanese Self Defense Forces, reflect the constant presence but ever changing contours of the imperium. […]

These military failures are illuminating. They remind us that with but a few exceptions (most notably the short Gulf War against Iraq in 1991), the postwar US military has never enjoyed the sort of overwhelming victory it experienced in World War II. The “war on terror” that followed September 11 and has dragged on to the present day is not unusual apart from its seemingly endless duration. On the contrary, it conforms to this larger pattern of postwar US military miscalculation and failure.

These failures also tell us a great deal about America’s infatuation with brute force, and the double standards that accompany this. In both wars, victory proved elusive in spite of the fact that the United States unleashed devastation from the air greater than anything ever seen before, short of using nuclear weapons.

This usually comes as a surprise even to people who are knowledgeable about the strategic bombing of Germany and Japan in World War II. The total tonnage of bombs dropped on Korea was four times greater than the tonnage dropped on Japan in the US air raids of 1945, and destroyed most of North Korea’s major cities and thousands of its villages. The tonnage dropped on the three countries of Indochina was forty times greater than the tonnage dropped on Japan. The death tolls in both Korea and Indochina ran into the millions.

Here is where double standards enter the picture.

This routine US targeting of civilian populations between the 1940s and early 1970s amounted to state-sanctioned terror bombing aimed at destroying enemy morale. Although such frank labeling can be found in internal documents, it usually has been taboo in pro-American public commentary. After September 11, in any case, these precedents were thoroughly scrubbed from memory.

“Terror bombing” has been redefined to now mean attacks by “non-state actors” motivated primarily by Islamist fundamentalism. “Civilized” nations and cultures, the story goes, do not engage in such atrocious behavior. […]

Nuclear weapons were removed from Okinawa after 1972, and the former US and Soviet nuclear arsenals have been substantially reduced since the collapse of the USSR. Nonetheless, today’s US and Russian arsenals are still capable of destroying the world many times over, and US nuclear strategy still explicitly targets a considerable range of potential adversaries. (In 2001, under President George W. Bush, these included China, Russia, Iraq, Iran, North Korea, Syria, and Libya.)

Nuclear proliferation has spread to nine nations, and over forty other countries including Japan remain what experts call “nuclear capable states.” When Barack Obama became president in 2009, there were high hopes he might lead the way to eliminating nuclear weapons entirely. Instead, before leaving office his administration adopted an alarming policy of “nuclear modernization” that can only stimulate other nuclear nations to follow suit.

There are dynamics at work here that go beyond rational responses to perceived threats. Where the United States is concerned, obsession with absolute military supremacy is inherent in the DNA of the postwar state. After the Cold War ended, US strategic planners sometimes referred to this as the necessity of maintaining “technological asymmetry.” Beginning in the mid-1990s, the Joint Chiefs of Staff reformulated their mission as maintaining “full spectrum dominance.”

This envisioned domination now extends beyond the traditional domains of land, sea, and air power, the Joint Chiefs emphasized, to include space and cyberspace as well.