How did American English become standardized?

Someone asked me about General American (GA) dialect, sometimes called Standard American. This person specifically asked, “In the 30’s to 60, there was the transatlantic accent, but I was wondering when general american became the norm for tv / movies?”

General American is a variant of American Midland dialect. It’s considered to have its most representative form in a small area of the far western Lower Midwest, mostly but not entirely west of the upper Mississippi River: central-to-southern Iowa, northern Missouri, eastern Nebraska, and northwestern Illinois. Major mainstream media figures such as Ronald Reagan and Walter Cronkite came from this part of the country, Illinois and Missouri respectively.

The archetype of GA in broadcasting was Edward Murrow, who was born in North Carolina but moved early on to the rhotic region of the Pacific Northwest, specifically Washington state. According to Thomas Paul Bonfiglio (Race and the Rise of Standard American English, pp. 173-4), Murrow’s “nightly radio audience was estimated to be 15,000,000 listeners” and he was widely considered “the foremost American correspondent of that era.” Murrow’s career took off during WWII, when America’s image of greatness finally took form (with the help of the destruction of Europe), and the voice that came to be identified with this new great America was that of GA-speaking Edward Murrow. He helped train and inspire an entire generation of broadcasters that followed him. Bonfiglio then states that,

Those who were hired and trained by Murrow in turn hired and trained Walter Cronkite, Mike Wallace, Harry Reasoner, Roger Mudd, Dan Rather, and Chet Huntley (230). Walter Cronkite, who was often characterized as the most trusted man in America, characterized himself as a “direct descendent of the Murrow tradition”

There are variants of GA found across the Midwest, in the Far West, and along the West Coast. Many people working in radio, television, and movies speak GA—whether or not it was the dialect they spoke growing up. GA became standard for a number of reasons, besides those already mentioned.

Let me begin with a discussion of the Midwest.

The Midwest has long been the median and mean of the population in the United States. With great soil and plentiful water for agriculture and industry, it attracted most of the immigrant population from the 1800s onward. Even most people heading further west passed through this region. For this reason, the early railroads were built heavily in the Midwest. Chicago, in particular, was in the past the hub of America. The ‘Midwest’ symbolically is quite broad, imaginatively encompassing almost the entirety of the American interior.

The Midwest was increasingly where large audiences could be reached, an important factor in early broadcasting. Another important factor was that the area of GA is most equidistant from all other areas of the country, and so the dialect is the most familiar to most Americans—i.e., it sounds neutral, as if without accent.

Some have gone so far as to argue that GA is inherently more of a ‘neutral’ accent in that it is easier to speak or sing for most people; and if that were the case, it could have helped it spread more easily. Interestingly, GA is in some ways closer to early British English than is contemporary British English, as rhotic pronunciation of ‘r’ sounds used to be the norm for British English and still is for GA. Rhotic English, in the United States, is also what distinguishes (Mid-)Western dialect from Eastern and Southern dialect.

By the way, Reagan worked in Midwestern broadcast radio before he became a Hollywood actor. Strangely, quite a few cowboy actors came from or near the area of GA dialect, such as John Wayne from southern Iowa (his father having been from Illinois and his mother from Nebraska). Wayne has a way of speaking that is hard to pinpoint regionally, other than it sounding vaguely ‘Western’, definitely not Eastern or Southern.

GA took longer to take hold in entertainment media, as regional dialects remained popular in many television shows. In 1934, there was the first “syndicated programming, including The Lone Ranger and Amos ‘n’ Andy” (Radio in the United States). It was news broadcasters who helped make GA the norm for the country, although even this took a while (Bonfiglio, p. 58): “Even in the late thirties, the idea of a standard American English had not yet been located in a specific region, and a sort of linguistic relativism in the field of pronunciation prevailed.” Besides those named above, there were others such as Clifton Garrick Utley (along with his mother and father, who also worked for NBC) and Vincent Pelletier or, even over in Ohio, someone like Lowell Jackson Thomas. Midwestern broadcasters like these only gained wider national audiences starting in the 1940s, and so they helped to define the emerging perception of a Standard American or General American dialect. The world war era helped fuel the seeking of a national identity and hence a national way of speaking. It helped that Western broadcasters like Edward Murrow similarly spoke rhotic GA.

Plus, the Midwest developed the only thriving regional public radio, partly because of the large number of land grant colleges. It’s not that public radio initially was all that important nationally. But it had great influence in the region. And it probably had some later influence on the eventual establishment of National Public Radio.

Still, early broadcasters do sound different from broadcasters today. Even Cronkite at the beginning of his career had a more clipped style. This had less to do with regional dialect and maybe more to do with the medium itself at the time—as dthrasher explained: “I’d guess that the ‘50’s accent’ you hear had much to do with the technology of AM and shortwave radio. Precise diction and a somewhat clipped style for words and phrases helped to overcome the crackle and hiss of static in radio reception.” He also points out “that many movie and television stars of that era got their start in theater,” a less casual way of speaking, but I’m not sure how much influence that would have had on the field of broadcasting.

What exactly changed, besides technology, in the mid-20th century? Bonfiglio emphasizes that there was a growing desire for standardization in the 1940s. An obvious reason for this was the rise of the public school movement as part of the response to the perceived threat of ethnic immigrants who weren’t assimilating fast enough for many WASPs. As Bonfiglio writes (p. 59):

In 1944, the New York State Department of Education formed a committee to decide on standards of pronunciation to be taught in public schools (C. K. Thomas 1945). The committee was comprised of over a dozen national language experts, who decided that the pupils should all become acquainted with the three types of American pronunciation: “Eastern, Southern and General American.”

So, it wasn’t (Mid-)Westerners declaring themselves as speaking General American. Apparently, even those outside of the (Mid-)West acknowledged that there was this broadly American dialect that was neither Eastern nor Southern. But why did this matter?

The South obviously wouldn’t become the standard because it is the region that started and lost the Civil War. Besides, the South didn’t have a large concentrated population as did the North, a major reason for their having been overwhelmed by the Union army. That still leaves the upper East Coast region, as it did initially dominate early entertainment media. The mid-Atlantic consisted of a massive population, from the 1800s into the early 1900s. The problem was that this massive population was also massively diverse, with a large influx of Southern and Eastern Europeans, including many non-Protestants (Jews, Catholics, and Eastern Orthodox).

This led many to look to the (Mid-)West for a ‘real’ American identity, probably related to the growing popularity of movie Westerns and all that they mythologized in the public mind. Americans early on came to symbolize their aspirational identity with the West, the Midwest being the first American West. A state like Iowa, west of the upper Mississippi River, was a clear demarcation point for where dialect was most distinct from the East and South, a place where there were few Jews and blacks.

The rhotic dialect was quite broadly distributed in the Western United States, even being heard from a Texan like Dan Rather, though it is true his mother and her family came from Indiana—it does make me wonder what dialect he spoke as a child and young adult. It should be noted that Texas received a fair amount of German immigrants, many having passed through the Midwest before settling in Texas. Then there are other broadcasters such as Tom Brokaw from South Dakota and Peter Jennings from Canada, both areas of rhotic accent among other shared linguistic characteristics. Standard Canadian English is closely related to Standard American English and, indeed, there was much early immigration between Canada and (Mid-)Western United States.

Following the Civil War and into the 20th century, the population was simultaneously growing in the Midwest and West Coast. This represented the future of the country, not just major agricultural regions but the emergence of major industries and new centers of media.

The first movie shot in Hollywood was made in 1910. It was a silent movie, and hence accent wasn’t yet an issue. It would be a couple of decades before films with sound became common. I was reading that it was WWI that disrupted film production in other countries, with California becoming an emerging center; the studio system and star system developed there.

The numbers moving westward increased vastly following the Great Depression and the Dust Bowl. Many of those who ended up in California came from the Midwest, the area of the greatest population and the origin of what has come to be called Standard American English.

The far Middle West accent had already established itself as important. The earliest radio broadcasters that reached the largest numbers of listeners often came from the Midwest or otherwise similarly speaking regions. When so many Midwesterners moved to California, they brought their accent with them. Midwestern broadcasters like Ronald Reagan sometimes became movie stars. Consider also the stereotypical California surfer dude made famous through Hollywood movies. Many of the movie stars and movie extras were of German and Scandinavian ancestry, which had been concentrated in the Midwest. Beach movies came to replace Westerns, but I’m not sure how that might have played into changing attitudes about General American.

The boom of the defense industry and population in California after WWII made it a more important center of culture and media. California even became the center of a religious movement that would take the country by storm, the new mega-churches that reached massive television audiences. One of these California preachers was Robert H. Schuller, who was born and raised in Iowa.

I suppose it took decades for the new accent to become more common in mainstream media. By the 1990s, Standard American English definitely had won out as the new dominant accent for the country. It was becoming more common on 1980s tv, such as with Roseanne, which began in 1988. New York City is still a major media center, but it is now mostly known for print media. Even so, there remains a media nostalgia in making movies about New York City, whether or not they are still made there.

The transition to GA dominance wasn’t an accident. There were demographic reasons that made it more probable. But it must be noted that many intentionally promoted it. The Midwest represented a tradition that simultaneously included immigrant diversity and assimilation. This tradition at times was promoted quite forcefully, such as by Klansmen of the Second Klan who hated non-WASP ethnic-Americans (i.e., hyphenated Americans). Mainstream media corporations, as gatekeepers, were quite self-conscious in establishing English standardization. The media companies, as stated by Bonfiglio, went so far as to hire professionals from the early speech correction field to teach their broadcasters to speak this newly emerging mainstream standard of American English.

The person who posed the question to me about General American, followed up with this comment: “Even Rosanne doesn’t sound all GA to me. And John Goodman sounds southernish. Was just wondering. I notice some say that after 60s black and white tv it became standard. But I really don’t see that to be the case at all.”

The Roseanne cast had a diverse group of actors. Roseanne Barr was born in Utah, but when she was still young she moved to Colorado which is partly in the Midlands dialect region—her accent is a mix. Several of the other people on the show were born in the Midwest, specifically three from Illinois and one from Michigan. A few were from California and probably spoke more GA, although it’s been a long time since I’ve watched the show.

John Goodman was born in St. Louis, Missouri—what many would consider culturally part of the Midwest, although there is a Southern influence in Missouri. I’ve even heard a Southern accent in southeast Iowa, from someone who lived just across the Mississippi River. Western Illinois and northern Missouri are part of the specific subset of Midlands dialect (i.e., pure GA) that has become so well known in the mainstream media.

My mother grew up in the Midlands region, central Indiana to be precise. Even she had a Southern-like accent when she was younger, the Hoosier accent that is akin to what is heard in the Upper (Mountain) South. She lost it early on and now speaks GA. As a speech pathologist, it was part of her job to teach students to speak GA.

I spent many formative years right in the heart of the heart of General American. Even after spending years in the South, it didn’t take long to start speaking GA once I was back in Iowa. It drove my mother wild when I picked up some Southern dialect and she would correct my language, as is her habit. Maybe she was happy when I returned to speaking solid Midwestern dialect.

About early television shows, one to consider is Happy Days. It was set in Milwaukee, Wisconsin. One of the actors was from Wisconsin. Some others were from Minnesota, Oklahoma, Illinois, and California. There were a few New Yorkers in that cast as well.

Oddly, one of its spin-off shows, Laverne & Shirley, was also supposedly set in the same Milwaukee location. But its cast was overwhelmingly from New York. Another spin-off from Happy Days was Mork & Mindy, which was supposed to be set in Boulder, Colorado. The two main actors were from Illinois and Michigan, Robin Williams being from Chicago. Of the rest of the cast, two were from Ohio, two from Texas, and one from New York.

From my childhood and young adulthood, there were popular shows like The Wonder Years. The main actor and the actress playing his mother were both from Illinois. By the time that show was on, it probably didn’t matter where actors/actresses came from. Most of them were learning to speak GA. It was probably in California, not the Midwest, where most people in entertainment media learned to speak GA. A Southerner like Stephen Colbert is a good example of someone losing a distinctly regional accent in order to speak GA, although he probably didn’t need to go to California.

If I had to guess, GA came to dominate news reporting and Hollywood movies before it came to dominate tv shows. I’m not sure why that might be. If that is the case, your guess would be as good as mine. One guess might be that tv shows never drew as large audiences, and so General American was less important. News reporting, once it became national, on the other hand, demanded an accent that was understandable to the most people. Hollywood movies likewise had larger and more diverse audiences.

According to one theory, General American simply happens to be the accent most Americans can understand the most easily and clearly. Bonfiglio, however, considers that to be an ethnocentric and racist rationalization for the dominance of the (Mid-)Western equivalent of the Aryan race, that perceived superior mix of Anglo-Saxon British and Northern European ancestries. Maybe so or maybe not.

About my mother’s career as a speech pathologist teaching ‘proper’ GA English, my interlocutor then asked the following set of questions, “Just wondering, what era was this? I just find it odd when I watch so much 80s tv and movies, GA isn’t used. What did she teach them for? And was the GA that she taught the one that you mention today? was the accent even remotely similar to what we consider GA today?”

Having been born in the 1940s, my mother started work in the late 1960s and continued until the 2000s. So, she grew up and worked in the precise period of GA dialect fully taking over.

I talked to my mother. We discussed the changes in her own speech.

She doesn’t clearly remember having a Southern-sounding accent, or rather a Hoosier accent, but it clearly can be heard on an old audio recording of her from back in the late 1960s, the time of her life when she had recently finished college and had begun her career as a speech pathologist.

I asked her if her professors spoke GA. She said that they probably did. She does remember when she was younger that she pronounced in the same way the words ‘pool’, ‘pull’, and ‘pole’. And, when she was in college, a professor corrected her for saying ‘bof’ in place of ‘both’. My mother still will occasionally fall into Hoosier dialect by saying ‘feesh’ for ‘fish’ and ‘cooshion’ for ‘cushion’, the latter example happens commonly in her everyday speaking.

For the most part, my mom speaks GA these days. There is no hint of a Hoosier accent. And, around strangers, she is probably more careful in not using those Hoosier pronunciations. But, even as late as the early 1980s, some people in northern Illinois told my mother that she had what to them sounded like a slight Southern accent. For the time we lived in Illinois and Iowa, we were in the area of GA which probably helped my mom lose what little she had of her childhood dialect.

I also asked my mother about her career as a speech pathologist. She said that early on she thought little about dialect, either in her own speaking or that of students. She did work for a few years in the Deep South before I was born, when my dad was stationed at a military base. She would have corrected both black and white Southern children without any thought about it. Compared to Deep Southern dialect, I’m sure my mother even when young sounded Midwestern, an approximation of the rhotic GA dialect.

It was the late 1980s when our family moved to South Carolina. My mother said that is the first time she was told not to correct the dialect of black students. She still did tell her black students the different ways to pronounce sounds and words, and she modeled GA, but she couldn’t technically teach them proper English. At that time, she also wasn’t allowed to work with kids who had English as a second language, for there were separate ESL teachers. Yet, back in the early 1980s, she worked with some Hispanic students in order to teach them proper English.

Until South Carolina, she says she never considered dialect in terms of her speech work. It seems that the language professions were rather informal until later in her career. She spent the longest part of her career in South Carolina where she worked for two decades. Her field had become extremely professionalized at that point and all the language fields were territorial about the students they worked with and the type of language issues they specialized in.

So, my mother’s own way of speaking English changed over her career as the way she taught language changed. By the end of her career, she says even a speech pathologist from the South and working in the South with Southern students would have taught GA, at least to white students and probably informally to black students as well. She said that speech pathologists ended up teaching code switching, in that they taught kids that there were multiple ways of speaking words. She pointed out that many older blacks she worked with, including a principal, didn’t code switch—that makes sense, as they probably were never taught to do so.

My mother’s career wasn’t directly involved in dialect and accent. She was a speech pathologist which means she largely focused on teaching articulation. She never thought of it as teaching kids GA, even if that was the end result.

That field is interesting. When my mother started, it was called speech correction. Then early in her career it was called speech therapy. But now it is speech-language pathology. The change of name correlated with changes in what was being taught in the field.

I don’t know if General American itself changed over time. It’s interesting to note that many of the earliest speech centers and speech corrections/therapy schools in the US were in the Midwest, where many of the pioneers (e.g., Charles Van Riper) in the field came from—such as Michigan, Wisconsin, and Iowa. Right here in the town I live in, Iowa City, was one of the most influential programs and one of the main professors in that program was born in Iowa City, Dean Williams. As my mother audited one of Williams’ classes, she got to know him and he worked with my brother’s stuttering. Interestingly, Williams himself came in contact with the field because of his own childhood stuttering, when Wendell Johnson helped him. My mother heard Williams say that, while he was in the military during WWII, Johnson sent him speech journals as reading material which inspired him to enter the field when he returned after the war.

So, it appears at least some of the speech fields in the US developed in or near the area of General American dialect. Maybe that is because of the large non-English immigrant populations that settled in the Midwest. German-Americans were the largest demographic in the early 20th century and, accordingly, to mainstream WASP culture this was one of the greatest threats. Even in a college town like Iowa City, the Czechs felt compelled to start their own Catholic church because they couldn’t understand the priest at the German Catholic church. Assimilation was slow to take hold within ethnic immigrant communities. Language standardization and speech correction became a priority for the purveyors of the dominant culture.

Let me point out one thing in relation to my mother. She went to Purdue. The head of her department was Max David Steer, having been in that position from 1963 to 1970, the exact years my mother spent at Purdue. He was a New Yorker, but he got his Ph.D. from the University of Iowa here in Iowa City. Like Williams, he probably also learned under Johnson. The field was small at that time and all of these figures would have known each other.

Here is an amusing side note.

My mother began her education when the field was in transition. Speech corrections/therapy had only been a field distinct from psychology since after WWII, although the program at Purdue started the same year my mother started school, 1963. When she got her master’s degree, 1969-70, they had just begun teaching transformational linguistic theory. She says it was highly theoretical and way over her head. Guess who was one of the major influences on this development: the worldwide infamous left-winger, Noam Chomsky. So, my mother learned a bit about Chomskyan linguistic theory back in the day.

By the way, listening to Chomsky speak, it definitely is more or less GA. He grew up in Pennsylvania. It was Pennsylvanian culture that some argue was the greatest influence on Midwestern culture. This is because so many early immigrants entered the United States through Pennsylvania and from there settled in the Midwest. But there is a definite accent that can be found among many Pennsylvanian natives. It’s possible that Chomsky picked up the GA dialect later in life. Anyway, he personifies the neutral/objective-sounding intellectuality of GA in its most standardized mainstream form—so straightforward and unimposing, at least in the way Chomsky speaks it.

I get the sense that, going back far enough, few overtly worried about standardized English. It was simply considered proper English, at least by the mid-20th century. I have no idea when it first became considered proper English in the US. If I had to hazard a guess, the world war era probably helped to establish and spread General American since so many soldiers would have come from the (Mid-)West, the greatest proportion of population in the country—larger than the Southern, Mid-Atlantic, and Northeastern populations combined. It might be similar to how a distinct Southern accent didn’t exist until the Civil War when Southern soldiers fought together and came to share a common identity. Edward Murrow, of course, played a role as the manly voice of WWII describing firsthand accounts of fighting and bombings to the American public back at home.

Whether or not it deserves this prominent position, I suspect General American dialect is here to stay. To most people of this country and around the world, this dialect represents American society. It has become not just dominant here but in most places where English is spoken.

GA has even come to be promoted in the non-entertainment media of the British Broadcasting Corporation (BBC), specifically for news shows directed at the non-British, as the BBC reaches an international audience. Hollywood has, of course, spread GA English to other countries. So have video games, as the largest consumers of this product are Americans, which creates a bias in the entire industry. More English-speakers in the world have a GA dialect than any other dialect.

General American has become the unofficial standard of English almost everywhere. It is the English dialect that most people can easily understand and not recognize as being a dialect.

Frrrreeeeeddoommmm?????

Jared Dillian wrote an article simply titled, Frrrreeeeeddoommmm. I think we are supposed to imagine the title being screamed by Mel Gibson as his Braveheart character, William Wallace, is tortured to death. The author compares two states, concluding that he prefers ‘freedom’:

“If you want someone from Connecticut to get all riled up, drive extra slow in the passing lane. Connecticutians are very particular about that. The right lane is for traveling, the left lane is for passing. If you’re in the left lane for any other reason than passing, you are a jerk.

“So if you really want to ruin someone’s day, drive in the left lane at about 50 miles per hour. They will be grumpy for three days straight, I assure you.

“I was telling this story to one of my South Carolina friends—how upset people from Connecticut get about this, and how people from South Carolina basically drive however the hell they want—and he said ruefully, “Freedom…”

“He’s a guy who perhaps likes lots of rules to organize society, and perhaps he’d rather live in a world where some law governs how you conduct yourself in every aspect of your life, including how you drive. I tell you what, after growing up in Connecticut and then spending the last six years in the South, I’m enjoying the freedom, even if it means I occasionally get stuck behind some idiot.”

Here is my response. Mine isn’t exactly a contrarian view. Rather, it’s more of a complexifying view.

I take seriously the freedom to act, even when others think it’s wrong, depending of course on other factors. But there is no such thing as absolute freedom, just trade-offs made between benefits and costs. There are always constraints upon our choices and, as social animals, most constraints involve a social element, whether or not laws are involved.

Freedom is complex. Freedom from what and/or toward what?

The driving example is perfect. Connecticut has one of the lowest rates of car accidents and fatalities in the country. And South Carolina has one of the highest. Comparing the most dangerous driving state to the safest, a driver is 10 times more likely to die in an accident.

Freedom from death is no small freedom. Yet there is more to life than just freedom from death. Authoritarian countries like Singapore probably have low car accident rates and fatalities, but I’d rather not live in an authoritarian country.

There needs to be a balance of freedoms. There is an individual’s freedom to act. And then there is the freedom to not suffer the consequences of the actions of others. There is nothing free in externalized costs or, to put it another way, all costs must be paid by someone. It’s related to the free rider problem and moral hazard.

That is supposed to be the purpose of well designed (i.e., fair and just) political, legal, and economic systems. Freedom doesn’t just happen. A free society is a creation of choices made by many people over many generations. Every law passed does have unintended consequences. But, then again, every law repealed or never passed in the first place also has unintended consequences. There is no escaping unintended consequences.

There is also a cultural component to this. Southern states like South Carolina have a different kind of culture than Northern states like Connecticut. Comparing the two regions, the South is accident prone in general with higher rates of not just car accidents but also such things as gun accidents. In the North, even in states with high gun ownership, there tends to be lower rates of gun accidents.

In Connecticut or Iowa, it’s not just a lower rate of dying in accidents (car, gun, etc.). These kinds of states have lower mortality rates in general and hence, on average, longer lifespans. Maybe it isn’t the different kinds of laws that are the significant causal factor. Instead, maybe it’s the cultural attitude that leads both to particular laws and particular behaviors. The laws don’t make Connecticut drivers more safe. It’s simply that safety-conscious Connecticut drivers want such laws, but they’d likely drive safer even without such laws.

I’m not sure ‘freedom’ is a central issue in examples like this. I doubt Connecticutians feel less free for having safer roads and more orderly driving behavior. It’s probably what they want. They are probably just valuing and emphasizing different freedoms than South Carolinians.

There is the popular saying that your freedom ends at my nose. Even that leaves much room for interpretation. If your factory is polluting the air I breathe, your freedom to pollute has fully entered not only my nose but also my lungs and bloodstream.

It’s no mere coincidence that states with high accident rates also tend to have high pollution rates. And no mere coincidence that states with low accident rates tend to have low pollution rates. These are the kinds of factors that contribute to the disparity of mortality rates.

It also has to do with attitudes toward human life. The South, with its history of slavery, seems to view life as being cheap. Worker accident rates are also higher in the South. All of this does have to do with laws, regulations, and unionization (and laws that make union organization easier or harder). But that leaves the question of why life is perceived differently in some places. Why are Southerners more cavalier about life and death? And why do they explain this cavalier attitude as being an expression of liberty?

To many Northerners, this cavalier attitude would be perceived quite differently. It wouldn’t be placed in the frame of ‘liberty’. This relates to the North literally not being part of the Cavalier culture that became the mythos of the South. The Cavaliers fought on the losing side of the English Civil War and many of them escaped to Virginia where they helped establish a particular culture that was later embraced by many Southerners who never descended from Cavaliers*.

Cavalier culture was based on honor culture. It included, for example, dueling and violent fighting. Men had to prove themselves. Recent research shows that Southerners are still more aggressive today, compared to Northerners. This probably relates to higher rates of road rage and, of course, car accidents.

Our culture doesn’t just encourage or discourage freedom. It more importantly shapes our view of freedom.

(*The apparent origin of Dillian’s article is a bit ironic. William Wallace fought against England which was still ruled by a Norman king, which is to say ruled by those whose descendants would later be called Cavaliers in their defense of the king against the Roundheads. The French Normans had introduced such fine traditions as monarchy, aristocracy, and feudalism. But they also introduced a particular variety of honor culture that was based on class and caste, the very same tradition that became the partly fictionalized origin story of Southern culture.)

Common Sense of the Common People

There are all kinds of rationalizations favored by the wealthy and powerful. They claim most Americans are too stupid, ignorant, inexperienced, lazy, or whatever. The basic assumption is that the dirty masses are generally inferior—blame it on genetics, culture, failing communities, and broken families; just don’t blame it on the political and economic system.

The upper classes argue that we live in a meritocracy and that we need a wise paternalistic ruling elite to deal with all of the complicated issues. Average Americans just don’t understand and would mess everything up, if they were given power to make or influence decisions about public policy. In fact, the public gets blamed for all the problems, as they supposedly get the government they deserve, based on the false belief that we have a functioning democracy that represents the public.

Anyone who is paying attention at all knows this paternalistic condescension is built on empty rhetoric, on lies and spin, even if many of the wealthy and powerful have sincerely come to believe their own propaganda. Most wealth in the US is inherited, not earned. This isn’t even close to being a meritocracy. We are ruled by cronyism and corruption, both in government and the economy, partly because big gov and big biz have become so intertwined as to be inseparable.

The poor and young don’t have high rates of unemployment and underemployment because they are lazy; there simply are fewer legal jobs available, and the illegal jobs on the black market carry the risk of landing you in prison. Even those fortunate enough to be in an area with high employment often end up working long hours at multiple jobs, as wages have stagnated and buying power has fallen.

The poor, in particular, work harder on a regular basis than most of the rich have ever done in their entire lives. Illegal prostitutes, drug dealers, back-alley car mechanics, cash-paid yard workers, undocumented immigrant farm laborers, etc., work harder than the rich could ever imagine, whether or not these illegal forms of work are officially counted as part of the economy.

Research shows that the poor and young have better basic financial skills than do the wealthy and the old. The poor are less likely to fall into cognitive biases that would cause them to lose money, for the simple reason that they can’t afford to be careless with what little they have. And Millennials save a higher percentage of their earnings than do Boomers.

This is even true for complicated issues like the budget. There was a survey I saw a while back that showed the average American was better at balancing the budget than was the average politician. It’s probably not that politicians are intellectually incapable of comprehending financial issues. They simply are bought by special interests, unlike the average American. Balancing the budget isn’t rocket science.

We don’t live in a fair and just society. And we aren’t ruled by wise ruling elites and rational technocrats. It is simply wealth perpetuating wealth, power protecting power. The results of this are far from optimal. But imagine what could be accomplished if we had a fully functioning democracy.

* * *

Millennials Are Better Than Baby Boomers at Saving Money, Survey Shows
By Sean Williams, The Motley Fool

Millennials Are Outpacing Everyone in Retirement Savings
By Alexandra Mondalek, Time

Why the poor do better on these simple tests of financial common sense
By Max Ehrenfreund, Washington Post

Hey Congress, College Students Can Balance the Budget
By Jeffrey Dorfman, Real Clear Markets

Whose Work Counts? Who Gets Counted?

Working Hard, But For What?

To Be Poor, To Be Black, To Be Poor and Black

Structural Racism and Personal Responsibility

Partisanship vs Democracy

Here is a major problem of parties in the US political system.

We have a winner-take-all system. It creates a mindset of winning at any cost and dominating by any means. Other (better functioning, one might argue) democratic governments allow greater multi-party competition while simultaneously encouraging cooperation and alliance-forming among the parties. One would hope that this could lessen the tendency toward American-style territorial partisanship, with its Social Darwinism and groupthink, and so give democracy a fighting chance to achieve democratic results.

The US system has come to operate with parties winning by excluding rather than including the most voters. Republicans dismiss minorities and non-Christians. Democrats dismiss lower class whites and leftist reformers. Both parties, in a bipartisan stranglehold, dismiss independents, third partiers, discouraged non-voters, and the disenfranchised.

Neither party fights for the rights, values, and interests of most of the American population. Instead, the two parties defend their turf and seek only to represent their respective small bands of loyal followers. But in the end, the parties take their own partisans for granted, as the opinions of partisans are irrelevant when they’ll vote the party line no matter what. This leaves both parties free to do the bidding of big money, as research shows they do.

There is no incentive for either of the two main parties to act more democratically, to promote democracy, and to democratically represent the American people. Because of this, elections become empty spectacles to create an illusion of democratic process and consent of the governed.

We should and need to do better than this. But in order for that to happen, we have to demand better and put the force of conviction behind it. Otherwise, the force of desperation might lead us in directions we’d rather not go. Reform is the only way to prevent revolution. And, once revolution begins, it can’t be controlled.

My thoughts here aren’t particularly radical. They go back to the American founding generation and are part of the original intent of the Constitution. George Washington, in particular, warned against the dangers of political parties. Washington may have been too idealistic in his criticisms, as factionalism may be an inevitable part of any large, diverse society. Even if that is the case, his warnings about political parties remain and have proven true.

Fortunately, we don’t have to speculate about alternatives. The world is full of countries with different systems that can be studied and compared. Not all factionalism leads to American-style dysfunction and corruption. There is no reason we should remain attached to our dysfunction and corruption simply because it is our own. Think of it as a cancer needing to be removed so that the rest of the body may live.

If we don’t like the results we are getting (and obviously most Americans are dissatisfied with the status quo to an extreme degree), then why not try something different in seeking different results? We Americans owe no loyalty to the two-party establishment. Our primary loyalty belongs to our country and our fellow citizens, which is to say “We the People.” Democracy doesn’t mean the power of parties and politicians; rather, it literally is the power of the people.

Democracy?

I’ve previously written about stolen elections. The first election I voted in, in 2000, happened to be the most blatantly stolen election in US history. It went to the highest levels of power, involving a pivotal state governed by the brother of a major candidate and a partisan Supreme Court that decided to bypass democracy itself in order to declare the new ruler.

I don’t know what to make of it all. It really is messed up. Just another thing to make me despair. And heading into the new century was a time of my life when I didn’t need more despair.

It was my mid-twenties. Depression had hit me like a ton of bricks starting in my late teens. Leaving home for the first time, I was a lost cause and a lost soul. I dropped out of college and wandered aimlessly for a number of years, having endlessly contemplated suicide and one time attempted it. I eventually settled down, having permanently returned to my childhood home. At that point, I was in a slightly better frame of mind.

The turn of the century got everyone excited, with threats of the Y2K bug. It was a new century and a new millennium. We survived that with a sigh of relief, but the worst was yet to come. The coming decade of the aughts would not be a happy time. Even so, many looked to the new millennium with optimism, the Cold War having ended more than a decade before and the intervening years having seen a tech boom. The threat of terrorists and economic recessions weren’t yet on many people’s minds. The future seemed bright and ripe for change.

I remember that moment in time. I heard Nader give a speech during his presidential campaign. He gave me hope, as naive as that may sound. I can’t explain what an amazing thing hope can be when it has been lost for so long. Listening to Nader was beyond refreshing. It was inspiring. He was a politician who actually gave a damn. And cynical partisan Democrats attacked the likes of me for voting my conscience, a silly thing to do considering that I wasn’t a Democrat and neither were most of Nader’s supporters. But that is how partisan politics trumps all else, even democracy itself.

Following the Florida fiasco, the strangest thing in the world happened. Democrats rationalized it away, as their candidate rolled over and played dead (Kerry in 2004 followed Gore’s example, handing Bush a second term). The fullest recounts ever done showed that Gore won Florida (even more troubling developments happened in 2004), but no one wanted to know, especially not Democrats. To know the truth would mean having to admit the dark reality before us. And here we are still afraid of the truth.

Maybe there were good reasons for that fear. The powers that be were nothing to sniff at. I was reminded of this in coming across Clint Curtis’ allegations about vote rigging. What really caught my attention was the ‘suicide’ of an investigator, Raymond Lemme, who supposedly was about to bring info out to the public. There was also the suspicious death of a high-level Republican consultant, Michael Connell, after having been subpoenaed in a vote rigging investigation.

I don’t know what to do with this kind of thing. To most people, this is the territory of conspiracy theorists, ya know, crazy paranoiacs. It should, therefore, be dismissed from thought and banished from public debate. The problem is that I’m psychologically incapable of ignoring inconvenient and uncomfortable facts. Call it depressive realism. I just can’t turn away, as if it doesn’t matter.

The whole thing is highly plausible, even though proving specific connections is difficult. I do know that a lot of unusual activity happened in the 2000 and 2004 elections. All of this comes back to mind during this campaign season, watching all the strange things going on with the Democratic caucuses and primaries: voters being purged, voter status being mysteriously switched, exit polls not matching voting results, etc.

The failure of our system isn’t necessarily what can be proved. Rather, it’s what can’t be proved that is problematic. Our present system is designed to lack transparency and accountability, to leave few if any paper trails and any other traceable evidence. I’d be glad if we could simply verify nothing illegal or immoral happened, nothing anti-democratic was involved, but that is precisely what we can’t do. The one thing democracy can’t overcome is secrecy, as that makes corruption inevitable.

I can’t help thinking that future generations will remember the beginning of this century as one of the darkest times in American history. It will be known as the era when the enemy within became more dangerous than any foreign power.

If you are one of the rare courageous individuals who wants to know what is going on in the world, then read Democracy Undone by Dale Tavris or one of the many other books about the topic. Or if you’d rather not read an entire book, you can find some info in the videos and links below. Your mind will be blown, your heart broken, and your sense of justice outraged—the proper attitude of any freedom-loving American.

This leaves us all with one question: If we don’t have a functioning democracy, what kind of country is this? Don’t just pass over that question. Let it sink in. Let yourself feel despair, to mourn what has been lost. Stop for a moment and consider what this all means. Look at what is before you with eyes wide open.

* * *

Did Expert Witness, Activists Thwart a Rove Ohio Vote Plot?
by Andrew Kreig

Who’s Stealing Your Vote? A Documentary
by John Wellington Ennis

How to Rig an Election
by Victoria Collier

How the GOP Wired Ohio’s 2004 Vote Count for Bush to Win
by Steven Rosenfeld

New Court Filing Reveals How the 2004 Ohio Presidential Election Was Hacked
by Bob Fitrakis

New Evidence Of Vote Hacking Emerges In Ohio 2004 General Election Lawsuit
by Karoli Kuns

Why Was Uncertified ‘Experimental’ Software Installed on ES&S Tabulation Systems in 39 OH Counties Just Days Before Presidential Election?
By Brad Friedman


Clint Curtis
Wikipedia

Tom Feeney: Clint Curtis and vote fraud
SourceWatch

Michael Connell
Wikipedia

Mike Connell
SourceWatch

Programmers weigh in on vote-rigging idea, some details confirmed
by John Byrne

Death of Democracy
by Brad Friedman

Clint Curtis Investigator’s ‘Suicide’ Case Reopened By Georgia Police!
by Brad Friedman

The ghost of rigged elections past: New revelations on the death of Michael Connell
by Bob Fitrakis

These People Kill People You Know
by zapdam

Suspicious Deaths of Those Who Knew Too Much Under Bush’s Watch
by Diana Lee

You will know them by the trail of dead
Xymphora

Investigator’s Murder Cover-Up Straw That Broke Plot
by John Caylor

Global Eye
By Chris Floyd


Non-Identifying Environmentalists And Liberals

According to Gallup, the percentage of Americans identifying as environmentalists is about half of what it was a quarter century ago, when I was a young teenager. Yet other polls show that Americans are more concerned with environmental issues than ever before.

This is similar to how fewer Americans identify as liberal precisely during a time when polls show that a majority of Americans hold liberal positions on diverse issues. Older labels have lost their former meaning. They no longer resonate.

It isn’t as if Americans are becoming anti-environmentalist conservatives. Quite the opposite. It’s just that an increasing number of Americans, when given a choice, would rather identify as progressive, moderate, independent, or even socialist. In fact, the socialist label gets a more favorable opinion than the Tea Party label, although libertarianism is gaining favor.

Young Americans are the most liberal of any age demographic, in terms of their politics. They are more liberal than even the supposed liberal class, despite the young not self-identifying as liberal. They are so liberal as to be leaning leftist.

Conservatives are mistaken when they put too much stock in ideological labels and too little stock in substance of views. Their confusion is understandable. Many pollsters have had a hard time keeping up with changing labels, not initially realizing they needed to offer choices beyond the standard binary of liberal or conservative.

Not all of this can be blamed on pollsters, though. There was enough polling data to show that major shifts were afoot. Some pollsters were able to discern that Millennials had a majority positive opinion of ‘socialism’. That interesting fact of public opinion began showing up about a decade ago, but apparently few in the mainstream were paying attention until Sanders’ candidacy came along.

The older generations are shocked. As children of Cold War propaganda, they unsurprisingly have a knee-jerk reaction to the word ‘socialism’. More interesting is that these older Americans also dislike libertarianism. For the young, socialism and libertarianism are two expressions of their growing extremes of liberal-mindedness.

So, it’s more of a divide of generations than of ideology.

Central to this are environmental concerns. Most older Americans probably assume they will die before major environmental catastrophes happen, allowing them to shut these problems out of their minds and pretend they aren’t fully real. Younger Americans, on the other hand, realize they’ll be forced to deal with these problems they’re inheriting.

* * *

Americans’ Identification as “Environmentalists” Down to 42%

Americans’ Concerns About Water Pollution Edge Up

U.S. Concern About Global Warming at Eight-Year High

For First Time, Majority in U.S. Oppose Nuclear Energy

Opposition to Fracking Mounts in the U.S.

In U.S., 73% Now Prioritize Alternative Energy Over Oil, Gas

Social Disorder, Mental Disorder

“It is no measure of health to be well adjusted to a profoundly sick society.”
~ Jiddu Krishnamurti

“The opposite of addiction is not sobriety. The opposite of addiction is connection.”
~ Johann Hari

On Staying Sane in a Suicidal Culture
by Dahr Jamail

Our situation so often feels hopeless. So much has spun out of control, and pathology surrounds us. At least one in five Americans are taking psychiatric medications, and the number of children taking adult psychiatric drugs is soaring.

From the perspective of Macy’s teachings, it seems hard to argue that this isn’t, at least in part, active denial of what is happening to the world and how challenging it is for both adults and children to deal with it emotionally, spiritually and psychologically.

These disturbing trends, which are increasing, are something she is very mindful of. As she wrote in World as Lover, World as Self, “The loss of certainty that there will be a future is, I believe, the pivotal psychological reality of our time.”

What does depression feel like? Trust me – you really don’t want to know
by Tim Lott

Admittedly, severely depressed people can connect only tenuously with reality, but repeated studies have shown that mild to moderate depressives have a more realistic take on life than most “normal” people, a phenomenon known as “depressive realism”. As Neel Burton, author of The Meaning of Madness, put it, this is “the healthy suspicion that modern life has no meaning and that modern society is absurd and alienating”. In a goal-driven, work-oriented culture, this is deeply threatening.

This viewpoint can have a paralysing grip on depressives, sometimes to a psychotic extent – but perhaps it haunts everyone. And therefore the bulk of the unafflicted population may never really understand depression. Not only because they (understandably) lack the imagination, and (unforgivably) fail to trust in the experience of the sufferer – but because, when push comes to shove, they don’t want to understand. It’s just too … well, depressing.

The Mental Disease of Late-Stage Capitalism
by Joe Brewer

A great irony of this deeply corrupt system of wealth hoarding is that the “weapon of choice” is how we feel about ourselves as we interact with our friends. The elites don’t have to silence us. We do that ourselves by refusing to talk about what is happening to us. Fake it until you make it. That’s the advice we are given by the already successful who have pigeon-holed themselves into the tiny number of real opportunities society had to offer. Hold yourself accountable for the crushing political system that was designed to divide us against ourselves.

This great lie that we whisper to ourselves is how they control us. Our fear that other impoverished people (which is most of us now) will look down on us for being impoverished too. This is how we give them the power to keep humiliating us.

I say no more of this emotional racket. If I am going to be responsible for my fate in life, let it be because I chose to stand up and fight — that I helped dismantle the global architecture of wealth extraction that created this systemic corruption of our economic and political systems.

Now more than ever, we need spiritual healing. As this capitalist system destroys itself, we can step aside and find healing by living honestly and without fear. They don’t get to tell us how to live. We can share our pain with family and friends. We can post it on social media. Shout it from the rooftops if we feel like it. The pain we feel is capitalism dying. It hurts us because we are still in it.

Neoliberalism – the ideology at the root of all our problems
by George Monbiot

So pervasive has neoliberalism become that we seldom even recognise it as an ideology. We appear to accept the proposition that this utopian, millenarian faith describes a neutral force; a kind of biological law, like Darwin’s theory of evolution. But the philosophy arose as a conscious attempt to reshape human life and shift the locus of power.

Neoliberalism sees competition as the defining characteristic of human relations. It redefines citizens as consumers, whose democratic choices are best exercised by buying and selling, a process that rewards merit and punishes inefficiency. It maintains that “the market” delivers benefits that could never be achieved by planning.

Attempts to limit competition are treated as inimical to liberty. Tax and regulation should be minimised, public services should be privatised. The organisation of labour and collective bargaining by trade unions are portrayed as market distortions that impede the formation of a natural hierarchy of winners and losers. Inequality is recast as virtuous: a reward for utility and a generator of wealth, which trickles down to enrich everyone. Efforts to create a more equal society are both counterproductive and morally corrosive. The market ensures that everyone gets what they deserve.

We internalise and reproduce its creeds. The rich persuade themselves that they acquired their wealth through merit, ignoring the advantages – such as education, inheritance and class – that may have helped to secure it. The poor begin to blame themselves for their failures, even when they can do little to change their circumstances.

Never mind structural unemployment: if you don’t have a job it’s because you are unenterprising. Never mind the impossible costs of housing: if your credit card is maxed out, you’re feckless and improvident. Never mind that your children no longer have a school playing field: if they get fat, it’s your fault. In a world governed by competition, those who fall behind become defined and self-defined as losers.

Among the results, as Paul Verhaeghe documents in his book What About Me? are epidemics of self-harm, eating disorders, depression, loneliness, performance anxiety and social phobia. Perhaps it’s unsurprising that Britain, in which neoliberal ideology has been most rigorously applied, is the loneliness capital of Europe. We are all neoliberals now.

Neoliberalism has brought out the worst in us
by Paul Verhaeghe

We tend to perceive our identities as stable and largely separate from outside forces. But over decades of research and therapeutic practice, I have become convinced that economic change is having a profound effect not only on our values but also on our personalities. Thirty years of neoliberalism, free-market forces and privatisation have taken their toll, as relentless pressure to achieve has become normative. If you’re reading this sceptically, I put this simple statement to you: meritocratic neoliberalism favours certain personality traits and penalises others.

There are certain ideal characteristics needed to make a career today. The first is articulateness, the aim being to win over as many people as possible. Contact can be superficial, but since this applies to most human interaction nowadays, this won’t really be noticed.

It’s important to be able to talk up your own capacities as much as you can – you know a lot of people, you’ve got plenty of experience under your belt and you recently completed a major project. Later, people will find out that this was mostly hot air, but the fact that they were initially fooled is down to another personality trait: you can lie convincingly and feel little guilt. That’s why you never take responsibility for your own behaviour.

On top of all this, you are flexible and impulsive, always on the lookout for new stimuli and challenges. In practice, this leads to risky behaviour, but never mind, it won’t be you who has to pick up the pieces. The source of inspiration for this list? The psychopathy checklist by Robert Hare, the best-known specialist on psychopathy today.

What About Me?: The Struggle for Identity in a Market-Based Society
by Paul Verhaeghe
Kindle Locations 2357-2428

Hypotheses such as these, however plausible, are not scientific. If we want to demonstrate the link between a neo-liberal society and, say, mental disorders, we need two things. First, we need a yardstick that indicates the extent to which a society is neo-liberal. Second, we need to develop criteria to measure the increase or decrease of psychosocial wellbeing in society. Combine these two, and you would indeed be able to see whether such a connection existed. And by that I don’t mean a causal connection, but a striking pattern; a rise in one being reflected in the other, or vice versa.

This was exactly the approach used by Richard Wilkinson, a British social epidemiologist, in two pioneering studies (the second carried out with Kate Pickett). The gauge they used was eminently quantifiable: the extent of income inequality within individual countries. This is indeed a good yardstick, as neo-liberal policy is known to cause a spectacular rise in such inequality. Their findings were unequivocal: an increase of this kind has far-reaching consequences for nearly all health criteria. Its impact on mental health (and consequently also mental disorders) is by no means an isolated phenomenon. This finding is just as significant as the discovery that mental disorders are increasing.

As social epidemiologists, Wilkinson and Pickett studied the connection between society and health in the broad sense of the word. Stress proves to be a key factor here. Research has revealed its impact, both on our immune systems and our cardiovascular systems. Tracing the causes of stress is difficult, though, especially given that we live in the prosperous and peaceful West. If we take a somewhat broader view, most academics agree on the five factors that determine our health: early childhood; the fears and cares we experience; the quality of our social relationships; the extent to which we have control over our lives; and, finally, our social status. The worse you score in these areas, the worse your health and the shorter your life expectancy are likely to be.

In his first book, The Impact of Inequality: how to make sick societies healthier, Wilkinson scrutinises the various factors involved, rapidly coming to what would be the central theme of his second book — that is, income inequality. A very striking conclusion is that in a country, or even a city, with high income inequality, the quality of social relationships is noticeably diminished: there is more aggression, less trust, more fear, and less participation in the life of the community. As a psychoanalyst, I was particularly interested in his quest for the factors that play a role at individual level. Low social status proves to have a determining effect on health. Lack of control over one’s work is a prominent stress factor. A low sense of control is associated with poor relationships with colleagues and greater anger and hostility — a phenomenon that Richard Sennett had already described (the infantilisation of adult workers). Wilkinson discovered that this all has a clear impact on health, and even on life expectancy. Which in turn ties in with a classic finding of clinical psychology: powerlessness and helplessness are among the most toxic emotions.

Too much inequality is bad for your health

A number of conclusions are forced upon us. In a prosperous part of the world like Western Europe, it isn’t the quality of health care (the number of doctors and hospitals) that determines the health of the population, but the nature of social and economic life. The better social relationships are, the better the level of health. Excessive inequality is more injurious to health than any other factor, though this is not simply a question of differences between social classes. If anything, it seems to be more of a problem within groups that are presumed to be equal (for example, civil servants and academics). This finding conflicts with the general assumption that income inequality only hurts the underclass — the losers — while those higher up the social ladder invariably benefit. That’s not the case: its negative effects are statistically visible in all sectors of the population, hence the subtitle of Wilkinson’s second work: why more equal societies almost always do better.

In that book, Wilkinson and Pickett adopt a fairly simple approach. Using official statistics, they analyse the connection between income inequality and a host of other criteria. The conclusions are astounding, almost leaping off the page in table after table: the greater the level of inequality in a country or even region, the more mental disorders, teenage pregnancies, child mortality, domestic and street violence, crime, drug abuse, and medication. And the greater the inequality is, the worse physical health and educational performance are, the more social mobility declines, along with feelings of security, and the unhappier people are.

Both books, especially the latter, provoked quite a response in the Anglo-Saxon world. Many saw in them proof of what they already suspected. Many others were more negative, questioning everything from the collation of data to the statistical methods used to reach conclusions. Both authors refuted the bulk of the criticism — which, given the quality of their work, was not a very difficult task. Much of it targeted what was not in the books: the authors were not urging a return to some kind of ‘all animals are equal’ Eastern-bloc state. What critics tended to forget was that their analysis was of relative differences in income, with negative effects becoming most manifest in the case of extreme inequality. Moreover, it is not income inequality itself that produces these effects, but the stress factors associated with it.

Roughly the same inferences can be drawn from Sennett’s study, though it is more theoretical and less underpinned with figures. His conclusion is fairly simple, and can be summed up in the title of what I regard as his best book: Respect in a World of Inequality. Too much inequality leads to a loss of respect, including self-respect — and, in psychosocial terms, this is about the worst thing that can happen to anyone.

This emerges very powerfully from a single study of the social determinants of health, which is still in progress. Nineteen eighty-six saw the start of the second ‘Whitehall Study’ that systematically monitored over 10,000 British civil servants, to establish whether there was a link between their health and their work situations. At first sight, this would seem to be a relatively homogenous group, and one that definitely did not fall in the lowest social class. The study’s most striking finding is that the lower the rank and status of someone within that group, the lower their life expectancy, even when taking account of such factors as smoking, diet, and physical exercise. The most obvious explanation is that the lowest-ranked people experienced the most stress. Medical studies confirm this: individuals in this category have higher cortisol levels (increased stress) and more coagulation-factor deficiencies (and thus are at greater risk of heart attacks).

My initial question was, ‘Is there a demonstrable connection between today’s society and the huge rise in mental disorders?’ As all these studies show, the answer is yes. Even more important is the finding that this link goes beyond mental health. The same studies show highly negative effects on other health parameters. As so often is the case, a parallel can be found in fiction — in this instance, in Alan Lightman’s novel The Diagnosis. During an interview, the author posed the following rhetorical question: ‘Who, experiencing for years the daily toll of intense corporate pressure, could truly escape severe anxiety?’ (I think it may justifiably be called rhetorical, when you think how many have had to find out its answer for themselves.)

A study by a research group at Heidelberg University very recently came to similar conclusions, finding that people’s brains respond differently to stress according to whether they have had an urban or rural upbringing. What’s more, people in the former category prove more susceptible to phobias and even schizophrenia. So our brains are differently shaped by the environment in which we grow up, making us potentially more susceptible to mental disorders. Another interesting finding emerged from the way the researchers elicited stress. While the subjects of the experiment were wrestling with the complex calculations they had been asked to solve, some of them were told (falsely) that their scores were lagging behind those of the others, and asked to hurry up because the experiments were expensive. All the neo-liberal factors were in place: emphasis on productivity, evaluation, competition, and cost reduction.

Capitalist Realism: Is there no alternative?
by Mark Fisher
pp. 19-22

Mental health, in fact, is a paradigm case of how capitalist realism operates. Capitalist realism insists on treating mental health as if it were a natural fact, like weather (but, then again, weather is no longer a natural fact so much as a political-economic effect). In the 1960s and 1970s, radical theory and politics (Laing, Foucault, Deleuze and Guattari, etc.) coalesced around extreme mental conditions such as schizophrenia, arguing, for instance, that madness was not a natural, but a political, category. But what is needed now is a politicization of much more common disorders. Indeed, it is their very commonness which is the issue: in Britain, depression is now the condition that is most treated by the NHS. In his book The Selfish Capitalist, Oliver James has convincingly posited a correlation between rising rates of mental distress and the neoliberal mode of capitalism practiced in countries like Britain, the USA and Australia. In line with James’s claims, I want to argue that it is necessary to reframe the growing problem of stress (and distress) in capitalist societies. Instead of treating it as incumbent on individuals to resolve their own psychological distress, instead, that is, of accepting the vast privatization of stress that has taken place over the last thirty years, we need to ask: how has it become acceptable that so many people, and especially so many young people, are ill? The ‘mental health plague’ in capitalist societies would suggest that, instead of being the only social system that works, capitalism is inherently dysfunctional, and that the cost of it appearing to work is very high. […]

By contrast with their forebears in the 1960s and 1970s, British students today appear to be politically disengaged. While French students can still be found on the streets protesting against neoliberalism, British students, whose situation is incomparably worse, seem resigned to their fate. But this, I want to argue, is a matter not of apathy, nor of cynicism, but of reflexive impotence. They know things are bad, but more than that, they know they can’t do anything about it. But that ‘knowledge’, that reflexivity, is not a passive observation of an already existing state of affairs. It is a self-fulfilling prophecy.

Reflexive impotence amounts to an unstated worldview amongst the British young, and it has its correlate in widespread pathologies. Many of the teenagers I worked with had mental health problems or learning difficulties. Depression is endemic. It is the condition most dealt with by the National Health Service, and is afflicting people at increasingly younger ages. The number of students who have some variant of dyslexia is astonishing. It is not an exaggeration to say that being a teenager in late capitalist Britain is now close to being reclassified as a sickness. This pathologization already forecloses any possibility of politicization. By privatizing these problems – treating them as if they were caused only by chemical imbalances in the individual’s neurology and/or by their family background – any question of social systemic causation is ruled out.

Many of the teenage students I encountered seemed to be in a state of what I would call depressive hedonia. Depression is usually characterized as a state of anhedonia, but the condition I’m referring to is constituted not by an inability to get pleasure so much as by an inability to do anything else except pursue pleasure. There is a sense that ‘something is missing’ – but no appreciation that this mysterious, missing enjoyment can only be accessed beyond the pleasure principle. In large part this is a consequence of students’ ambiguous structural position, stranded between their old role as subjects of disciplinary institutions and their new status as consumers of services. In his crucial essay ‘Postscript on Societies of Control’, Deleuze distinguishes between the disciplinary societies described by Foucault, which were organized around the enclosed spaces of the factory, the school and the prison, and the new control societies, in which all institutions are embedded in a dispersed corporation.

pp. 32-38

The ethos espoused by McCauley is the one which Richard Sennett examines in The Corrosion of Character: The Personal Consequences of Work in the New Capitalism, a landmark study of the affective changes that the post-Fordist reorganization of work has brought about. The slogan which sums up the new conditions is ‘no long term’. Where formerly workers could acquire a single set of skills and expect to progress upwards through a rigid organizational hierarchy, now they are required to periodically re-skill as they move from institution to institution, from role to role. As the organization of work is decentralized, with lateral networks replacing pyramidal hierarchies, a premium is put on ‘flexibility’. Echoing McCauley’s mockery of Hanna in Heat (‘ How do you expect to keep a marriage?’), Sennett emphasizes the intolerable stresses that these conditions of permanent instability put on family life. The values that family life depends upon – obligation, trustworthiness, commitment – are precisely those which are held to be obsolete in the new capitalism. Yet, with the public sphere under attack and the safety nets that a ‘Nanny State’ used to provide being dismantled, the family becomes an increasingly important place of respite from the pressures of a world in which instability is a constant. The situation of the family in post-Fordist capitalism is contradictory, in precisely the way that traditional Marxism expected: capitalism requires the family (as an essential means of reproducing and caring for labor power; as a salve for the psychic wounds inflicted by anarchic social-economic conditions), even as it undermines it (denying parents time with children, putting intolerable stress on couples as they become the exclusive source of affective consolation for each other). […]

The psychological conflict raging within individuals cannot but have casualties. Marazzi is researching the link between the increase in bi-polar disorder and post-Fordism and, if, as Deleuze and Guattari argue, schizophrenia is the condition that marks the outer edges of capitalism, then bi-polar disorder is the mental illness proper to the ‘interior’ of capitalism. With its ceaseless boom and bust cycles, capitalism is itself fundamentally and irreducibly bi-polar, periodically lurching between hyped-up mania (the irrational exuberance of ‘bubble thinking’) and depressive come-down. (The term ‘economic depression’ is no accident, of course). To a degree unprecedented in any other social system, capitalism both feeds on and reproduces the moods of populations. Without delirium and confidence, capital could not function.

It seems that with post-Fordism, the ‘invisible plague’ of psychiatric and affective disorders that has spread, silently and stealthily, since around 1750 (i.e. the very onset of industrial capitalism) has reached a new level of acuteness. Here, Oliver James’s work is important. In The Selfish Capitalist, James points to significant rises in the rates of ‘mental distress’ over the last 25 years. ‘By most criteria’, James reports,

rates of distress almost doubled between people born in 1946 (aged thirty-six in 1982) and 1970 (aged thirty in 2000). For example, 16 per cent of thirty-six-year-old women in 1982 reported having ‘trouble with nerves, feeling low, depressed or sad’, whereas 29 per cent of thirty-year-olds reported this in 2000 (for men it was 8 per cent in 1982, 13 per cent in 2000).

Another British study James cites compared levels of psychiatric morbidity (which includes neurotic symptoms, phobias and depression) in samples of people in 1977 and 1985. ‘Whereas 22 per cent of the 1977 sample reported psychiatric morbidity, this had risen to almost a third of the population (31 per cent) by 1986’. Since these rates are much higher in countries that have implemented what James calls ‘selfish’ capitalism than in other capitalist nations, James hypothesizes that it is selfish (i.e. neoliberalized) capitalist policies and culture that are to blame. […]

James’s conjectures about aspirations, expectations and fantasy fit with my own observations of what I have called ‘hedonic depression’ in British youth.

It is telling, in this context of rising rates of mental illness, that New Labour committed itself, early in its third term in government, to removing people from Incapacity Benefit, implying that many, if not most, claimants are malingerers. In contrast with this assumption, it doesn’t seem unreasonable to infer that most of the people claiming Incapacity Benefit – and there are well in excess of two million of them – are casualties of Capital. A significant proportion of claimants, for instance, are people psychologically damaged as a consequence of the capitalist realist insistence that industries such as mining are no longer economically viable. (Even considered in brute economic terms, though, the arguments about ‘viability’ seem rather less than convincing, especially once you factor in the cost to taxpayers of incapacity and other benefits.) Many have simply buckled under the terrifyingly unstable conditions of post-Fordism.

The current ruling ontology denies any possibility of a social causation of mental illness. The chemico-biologization of mental illness is of course strictly commensurate with its de-politicization. Considering mental illness an individual chemico-biological problem has enormous benefits for capitalism. First, it reinforces Capital’s drive towards atomistic individualization (you are sick because of your brain chemistry). Second, it provides an enormously lucrative market in which multinational pharmaceutical companies can peddle their pharmaceuticals (we can cure you with our SSRIs). It goes without saying that all mental illnesses are neurologically instantiated, but this says nothing about their causation. If it is true, for instance, that depression is constituted by low serotonin levels, what still needs to be explained is why particular individuals have low levels of serotonin. This requires a social and political explanation; and the task of repoliticizing mental illness is an urgent one if the left wants to challenge capitalist realism.

It does not seem fanciful to see parallels between the rising incidence of mental distress and new patterns of assessing workers’ performance. We will now take a closer look at this ‘new bureaucracy’.

The Opposite of Addiction is Connection
by Robert Weiss LCSW, CSAT-S

Not for Alexander. He was bothered by the fact that the cages in which the rats were isolated were small, with no potential for stimulation beyond the heroin. Alexander thought: Of course they all got high. What else were they supposed to do? In response to this perceived shortcoming, Alexander created what we now call “the rat park,” a cage approximately 200 times larger than the typical isolation cage, with hamster wheels and multi-colored balls to play with, plenty of tasty food to eat, and spaces for mating and raising litters.[ii] And he put not one rat, but 20 rats (of both genders) into the cage. Then, and only then, did he mirror the old experiments, offering one bottle of pure water and one bottle of heroin water. And guess what? The rats ignored the heroin. They were much more interested in typical communal rat activities such as playing, fighting, eating, and mating. Essentially, with a little bit of social stimulation and connection, addiction disappeared. Heck, even rats who’d previously been isolated and sucking on the heroin water left it alone once they were introduced to the rat park.

The Human Rat Park

One of the reasons that rats are routinely used in psychological experiments is that they are social creatures in many of the same ways that humans are social creatures. They need stimulation, company, play, drama, sex, and interaction to stay happy. Humans, however, add an extra layer to this equation. We need to be able to trust and to emotionally attach.

This human need for trust and attachment was initially studied and developed as a psychological construct in the 1950s, when John Bowlby tracked the reactions of small children when they were separated from their parents.[iii] In a nutshell, he found that infants, toddlers, and young children have an extensive need for safe and reliable caregivers. If children have that, they tend to be happy in childhood and well-adjusted (emotionally healthy) later in life. If children don’t have that, it’s a very different story. In other words, it is clear from Bowlby’s work and the work of later researchers that the level and caliber of trust and connection experienced in early childhood carries forth into adulthood. Those who experience secure attachment as infants, toddlers, and small children nearly always carry that with them into adulthood, and they are naturally able to trust and connect in healthy ways. Meanwhile, those who don’t experience secure early-life attachment tend to struggle with trust and connection later in life. In other words, securely attached individuals tend to feel comfortable in and to enjoy the human rat park, while insecurely attached people typically struggle to fit in and connect.

The Opposite Of Addiction is Connection
By Jonathan Davis

If connection is the opposite of addiction, then an examination of the neuroscience of human connection is in order. Published in 2000, A General Theory Of Love is a collaboration between three professors of psychiatry at the University of California, San Francisco. A General Theory Of Love reveals that humans require social connection for optimal brain development, and that babies cared for in a loving environment are psychologically and neurologically ‘immunised’ by love. When things get difficult in adult life, the neural wiring developed from a love-filled childhood leads to increased emotional resilience. Conversely, those who grow up in an environment where loving care is unstable or absent are less likely to be resilient in the face of emotional distress.

How does this relate to addiction? Gabor Maté observes an extremely high rate of childhood trauma in the addicts he works with, and trauma is the extreme opposite of growing up in a consistently safe and loving environment. He asserts that it is extremely common for people with addictions to have a reduced capacity for dealing with emotional distress, hence an increased risk of drug-dependence.

How Our Ability To Connect Is Impaired By Trauma

Trauma is well-known to cause interruption to healthy neural wiring, in both the developing and mature brain. A deeper issue here is that people who have suffered trauma, particularly children, can be left with an underlying sense that the world is no longer safe, or that people can no longer be trusted. This erosion (or complete destruction) of a sense of trust, that our family, community and society will keep us safe, results in isolation – leading to the very lack of connection Johann Hari suggests is the opposite of addiction. People who use drugs compulsively do so to avoid the pain of past trauma and to replace the absence of connection in their life.

Social Solutions To Addiction

The solution to the problem of addiction on a societal level is both simple and fairly easy to implement. If a person is born into a life that is lacking in love and support on a family level, or if due to some other trauma they have become isolated and suffer from addiction, there must be a cultural response to make sure that person knows that they are valued by their society (even if they don’t feel valued by their family). Portugal has demonstrated this with a 50% drop in addiction thanks to programs that are specifically designed to re-create connection between the addict and their community.

The real cause of addiction has been discovered – and it’s not what you think
by Johann Hari

This has huge implications for the hundred-year-old war on drugs. This massive war – which, as I saw, kills people from the malls of Mexico to the streets of Liverpool – is based on the claim that we need to physically eradicate a whole array of chemicals because they hijack people’s brains and cause addiction. But if drugs aren’t the driver of addiction – if, in fact, it is disconnection that drives addiction – then this makes no sense.

Ironically, the war on drugs actually increases all those larger drivers of addiction: for example, I went to a prison in Arizona – ‘Tent City’ – where inmates are detained in tiny stone isolation cages (“The Hole”) for weeks and weeks on end, to punish them for drug use. It is as close to a human recreation of the cages that guaranteed deadly addiction in rats as I can imagine. And when those prisoners get out, they will be unemployable because of their criminal record – guaranteeing they will be cut off ever more. I watched this playing out in the human stories I met across the world.

There is an alternative. You can build a system that is designed to help drug addicts to reconnect with the world – and so leave behind their addictions.

This isn’t theoretical. It is happening. I have seen it. Nearly fifteen years ago, Portugal had one of the worst drug problems in Europe, with 1 percent of the population addicted to heroin. They had tried a drug war, and the problem just kept getting worse. So they decided to do something radically different. They resolved to decriminalize all drugs, and transfer all the money they used to spend on arresting and jailing drug addicts, and spend it instead on reconnecting them – to their own feelings, and to the wider society. The most crucial step is to get them secure housing, and subsidized jobs – so they have a purpose in life, and something to get out of bed for. I watched as they are helped, in warm and welcoming clinics, to learn how to reconnect with their feelings, after years of trauma and stunning them into silence with drugs.

One example I learned about was a group of addicts who were given a loan to set up a removals firm. Suddenly, they were a group, all bonded to each other, and to the society, and responsible for each other’s care.

The results of all this are now in. An independent study by the British Journal of Criminology found that since total decriminalization, addiction has fallen, and injecting drug use is down by 50 percent. I’ll repeat that: injecting drug use is down by 50 percent. Decriminalization has been such a manifest success that very few people in Portugal want to go back to the old system. The main campaigner against the decriminalization back in 2000 was Joao Figueira – the country’s top drug cop. He offered all the dire warnings that we would expect from the Daily Mail or Fox News. But when we sat together in Lisbon, he told me that everything he predicted had not come to pass – and he now hopes the whole world will follow Portugal’s example.

This isn’t only relevant to addicts. It is relevant to all of us, because it forces us to think differently about ourselves. Human beings are bonding animals. We need to connect and love. The wisest sentence of the twentieth century was E.M. Forster’s: “only connect.” But we have created an environment and a culture that cut us off from connection, or offer only the parody of it offered by the Internet. The rise of addiction is a symptom of a deeper sickness in the way we live–constantly directing our gaze towards the next shiny object we should buy, rather than the human beings all around us.

The writer George Monbiot has called this “the age of loneliness.” We have created human societies where it is easier for people to become cut off from all human connections than ever before. Bruce Alexander, the creator of Rat Park, told me that for too long, we have talked exclusively about individual recovery from addiction. We need now to talk about social recovery—how we all recover, together, from the sickness of isolation that is sinking on us like a thick fog.

But this new evidence isn’t just a challenge to us politically. It doesn’t just force us to change our minds. It forces us to change our hearts.

* * *

Social Conditions of an Individual’s Condition

Society and Dysfunction

It’s All Your Fault, You Fat Loser!

Liberal-mindedness, Empathetic Imagination, and Capitalist Realism

Ideological Realism & Scarcity of Imagination

The Unimagined: Capitalism and Crappiness

To Put the Rat Back in the Rat Park

Rationalizing the Rat Race, Imagining the Rat Park

The Desperate Acting Desperately

To Grow Up Fast

Morality-Punishment Link

An Invisible Debt Made Visible

Trends in Depression and Suicide Rates

From Bad to Worse: Trends Across Generations

Republicans: Party of Despair

Rate And Duration of Despair

Inequality Divides, Privilege Disconnects

Privilege is a tough subject. For most people, there are always plenty of others who are both more privileged and less privileged. Still, nuance and complexity aren’t how we tend to think about such things. It depends, as always, on what we focus upon and what we ignore—this typically being shaped by unconscious biases.

We don’t objectively compare ourselves to the larger social reality. And we don’t base our perceptions on intricate demographic data and comprehensive surveys. What we usually do is create a sense of our place in the world through personal anecdotes and vague media-filtered experience, through narrative frames and political rhetoric. This causes us to compare ourselves to the distorted and often fictionalized narratives portrayed in MSM news reporting and Hollywood movies—not to mention the influence of now near endless political campaigning and the subtle class war rhetoric that is drilled into our psyches. Besides that, it is human nature to focus on and, when possible, aspire toward what is above us. Even the wealthy will look with envy at the even wealthier. This is exaggerated in a high inequality society, where the gap between the rich and super-rich is as vast as the gap between the upper classes and all the rest, and such gaps continue to grow ever more vast. Only those near the bottom might bother to spend much time looking down upon those below them in the social pecking order, whether the differences are real (class) or perceived (race).

I’ve pointed out how this plays out for liberals—the privilege of the liberal class, the bias and benefits inherent to greater wealth and status, opportunities and resources. The liberal demographic is among the most economically well off and well educated. And, related to this, the wealthier any demographic (race, ideology, etc.), the more liberal its members tend to be, often both in terms of social liberalism and classical liberal economics. It’s not only about those who self-identify as liberals. A similar pattern is found among libertarians and other right-wingers, from objectivists to anarcho-capitalists. It’s true of the Republican political elite and activists, the conservative pundits and think tank intellectuals, the business managerial class and inherited old wealth. But it’s also true of most people on the far political left: Marxists, anarcho-syndicalists, feminists, etc. Even the typical minority activist and politician is going to be far above average in wealth and education. To hold and defend any particular ideology or identity politics largely depends on a privileged status in society. It takes a lot of time, energy, and resources to commit to such activities—especially if one makes a career out of it. The poor, whether working or unemployed, whether white or minority, don’t have this kind of luxury.

There is an odd dynamic here. The middle-to-upper class are more ‘liberal’ in many ways, including for those on the political right. Those far down the economic scale are less concerned about defending liberalism in any of its forms, whether leftist standard liberalism or right-wing classical liberalism. In Western countries, even radical left-wingers who often are critical of ‘liberalism’ are more culturally liberal than the poor. On the other hand, the lower classes (i.e., the majority of the population) are more liberal/leftist in concrete ways than the political elite that claims to represent them—supporting higher taxation of the rich and corporations, a stronger social safety net, more effective regulations, fewer wars of aggression and military adventurism, etc. The supposed conservatism of the lower class majority is primarily symbolic, not necessarily based on specific political principles and policies. But it could be seen as genuinely conservative in the lower class’ demand for more emphasis on social capital and culture of trust, family and community—the very things that are undermined by upper class politics and economics, especially neoliberalism. Anyway, it’s a class divide more than an ideological divide, as the differences between partisan/ideological elites are negligible in terms of practical politics and actual results.

The main thing, anyway, is that there are these fundamental divides in our society. They lead to endless disconnections and conflicts. Our thoughts are distorted and our vision narrowed, causing endless confusion and misunderstanding. This is why privilege is so hard to see and understand. We rarely ever get the sense of the full context of our lives. This has worsened because of the segregation of not just ghettos, housing projects, and rural isolation but also of suburbs, walled communities, and gentrified neighborhoods. Physical distance leads to psychological distance.

Obviously, it’s not just about the hypocrisy of self-identified liberals living comfortable lives, even though their example is especially egregious given the politics they outwardly support. This is a collective failure of the dominant liberal order of post-Enlightenment society; at this point, we are all liberals complicit in this failure, even those who spend their lives complaining about liberalism and blaming liberals. Hypocritical liberals simply make explicit what is implicit to the world we live in. Even poor Westerners are part of the problems involved in a long history of imperialism, colonialism, genocide, slavery, exploitation, etc. Living in the West, we are all legacy beneficiaries of immense crimes against humanity, in the past but also continuing into the present. This is particularly true of a country like the US, for being a subject of an empire has its advantages even for the poorest of subjects—not that it’s a great fate, though there are worse ones.

As a liberal, much of my focus has been on other liberals. But I want to clarify this. There is a reason I identify as a liberal. It’s because of, not in spite of, my criticisms. According to my most utopian ideals and futuristic visions, I could identify as a left-libertarian, anarcho-syndicalist, democratic-socialist, etc. I could grab hold of some ideology as a way of distancing myself from liberalism. I don’t want to do that. Instead, I want to emphasize that I’m complicit in all that goes on in this society. I don’t want to merely stand back from it all or worse still pretend I’m above it all. That is what irritates me about many left-wingers. It’s true that left-wing politics has little overt power in the world today, certainly not in the United States. But I think it’s a cop-out for left-wingers to play intellectual games with detached righteousness, lost in their highfalutin abstruse historical and economic analyses, as if they aren’t stuck down in the sewers covered with shit like the rest of us.

Joe Bageant was a Marxist and considered himself far left of the far left, but he never forgot his roots in poor white Appalachia. He complained about liberalism and yet at times admitted he was a liberal of sorts—worse still, an educated liberal and an old hippie at that. He was trying to make his voice heard from within the belly of the beast, not observing the beast’s behavior as if a zoologist studying from afar. From the opposite end of the class spectrum, there was someone like Theodore Roosevelt. His class solidarity apparently was a bit lacking, as he didn’t espouse an ideology for the wealthy and business interests. He took socialists seriously, in that he argued they made some valid points. Unlike many mainstream partisans today reacting to the supporters of Sanders and Trump, TR didn’t just dismiss the perceived radicals as loud-mouthed rabble-rousers and malcontents. He argued that many socialists were simply social reformers, not utopian ideologues, and that the issues they brought up should be taken seriously—it being better to allow genuine reform if it prevents violent revolution. Both Bageant and Roosevelt were making the simple point that we should listen to those making complaints and try to understand where they are coming from—i.e., don’t shoot the messenger.

Here is what has been on my mind, a specific demographic that is some combination of middle-to-upper class, well educated, professional, and mostly white. Out of this demographic come the politicians and activists, community organizers and social workers, intellectuals and academics, writers and artists, musicians and actors, journalists and reporters, etc. Despite being a minority of the population, they have greater power to be heard and to exert influence than all of the rest of the population combined. They are, of course, more economically secure and comfortable than most of the population, along with greater opportunity for economic mobility. They particularly dominate the political and media spheres, and so they determine the terms of public debate and control the framing of issues and narratives. These are the people who are most invested in the system and likewise benefit the most from the system, but they aren’t the people who experience the greatest harm from and costs of the system.

These are the privileged. These are the people who have the most insulated lives. They either don’t see or don’t understand many of the divides in our society. Certainly, they have little experience of those who live on the other side of those divides. They argue among themselves within a narrow frame of interests and ideas. Even the supposed radicals among them are safely contained within the dominant paradigm. Yet fissures are beginning to form in their walled reality. And the voices from outside are beginning to be heard. This disturbs their comfortable lives and puts them in an irritable mood. They realize their position in the social order is being threatened.

Even so, I don’t get the sense that most of these middle-to-upper class gentlefolk realize how bad it’s gotten for the majority of the population. Some do get it, but many more don’t. When I hear the criticisms of the supporters of Sanders and Trump, it becomes obvious that these critics are oblivious to the point of utter cluelessness. It’s not just economic problems getting worse: increasing poverty among the disadvantaged and growing inequality across society, higher rates of unemployment and underemployment (permanent unemployment no longer even being measured), falling behind other developed countries in economic mobility and in the size and wealth of the middle class, stagnating or falling real wages and buying power, etc. It’s also a worsening of rates of mental health issues, suicide and other causes of mortality, delayed marriage and divorce—and the destruction of all that held the social fabric together: deteriorating tight-knit farming communities and factory towns based on strong local economies, loss of high membership rates in labor unions and civic organizations, undermining of a culture of trust and civic participation, weakening of democratic process and representation, disempowerment and disenfranchisement and demoralization of the lower classes, economic segregation and isolation, underfunding of schools and libraries and infrastructure and public services, and so much else.

This isn’t directly impacting most of the comfortable middle-to-upper classes. They usually don’t even see its impact on others, except for occasional reports about it in the news media and occasional portrayals of it in the entertainment media. Even then, they have no real sense of what it means for those suffering and struggling. When they dismiss demands for reform from those below them, what they don’t understand is that these aren’t unreasonable requests. Those at the bottom of society don’t have the luxury to wait for slowly implemented moderate reforms. The system is broken. For the worst off, this at times can be a life or death situation. Some people are barely hanging on, at the end of their rope. As the economy worsens and the divides widen, desperation gets pushed to the breaking point with the inevitable result of soaring rates of mental health issues and suicides. Push it far enough and you will see even far worse consequences for all of society. The presently comfortable might find themselves increasingly uncomfortable, if they continue to ignore the victims of these oppressive problems. It’s not wise, much less moral and compassionate, to dismiss the pleas of the desperate.

* * *

The Unimagined: Capitalism and Crappiness

The Desperate Acting Desperately

Trends in Depression and Suicide Rates

From Bad to Worse: Trends Across Generations

Republicans: Party of Despair

Rate And Duration of Despair

Our Shared Imagination

Imagination. It’s such a simple thing. Yet so easily misunderstood and forgotten about. We take it for granted because most of the time it operates unconsciously.

I try to remind myself of it, so as to avoid falling into a rut of thinking. The society we live in does shape our imaginations, individual and collective. But we must not forget that our imaginations also shape the world around us. This is true for all societies and even more obvious for modern society. We directly shape the world around us. Maybe I have a clear sense of this because I live in the state that has the highest percentage of developed land in the country.

When we claim the power of our own imaginations, it can be empowering. History is full of examples of how imagination leads to great innovations and changes. At the same time, to have an active imagination can be frustrating because it allows for a larger perspective while also allowing for one to more easily switch perspectives.

We live in a time when imagination has become constrained to the most powerful paradigm in human existence, a paradigm that has come to rule minds around the world. It’s not any single ideology in the political sense. We could try to give it names and I have done so at times, but that might end up distracting from the thing itself.

This has been apparent this campaign season. The dominant paradigm remains immensely powerful, even as it is being challenged. More Americans are beginning to feel frustrated and imagination is awakening in response. The challenge for the American public is how to direct imagination toward something positive. It’s easy to tear down the dreams of others, especially for those who stand back refusing to offer anything themselves. Some attack the hopeful for challenging the status quo while others attack them simply out of, I don’t know what it is… fearful reaction? cynicism?

There are many forms of imagination. There is this standard expression of imagination as positive and active envisioning of possibility. I call this radical imagination. It is what fueled many of the revolutions of the past, including the American Revolution. Thomas Paine is my favorite advocate of this radical imagination.

Paine was also a practitioner of a second form of imagination—one might call it critical imagination. It’s the ability to uncover what is hidden, to open one’s eyes to what is in front of one’s face. We typically think things are obvious, after we’ve perceived them and made them part of our conscious sense of reality. Yet what later on is considered obvious can go unnoticed and unacknowledged for generations and even centuries. With critical imagination, we step back and hold something at arm’s length. It allows us to investigate and scrutinize something. It creates space for curiosity and questioning. As a result, we are more able to reinterpret and reframe, to make the familiar unfamiliar.

Critical imagination is necessary, but it can become dysfunctional if it acts in isolation from other forms of imagination. In that case, imagination can become a slave to skepticism and doubt. It is hard to get from this to radical imagination. What is needed is something to draw the psyche back into the world, attract the individual into relationship with what is being confronted. It is to touch and feel, not just to see from a distance.

This is what has been referred to as sympathetic imagination. It’s the ability to imaginatively enter the experience and worldview of others. It isn’t just to understand the way people are but also how they could otherwise be. It means to take people seriously and, in the process, remember the common humanity that connects the individual to society.

This can be the most challenging of imaginative capacities. I’ve struggled with this immensely. It’s no small task to get past differences of culture, religion, political ideology, or whatever else. Instead, we most often fall into generalizations and stereotypes.

One example I came across recently was that of Kenan Malik arguing that Islamic terrorists in Europe had no reasons, that they were nihilists. He made this argument, despite the fact that some of these people had stated their reasons. It is hard for us to imagine that such people who turn to extreme actions are human just like us. To imagine that would mean, under certain circumstances, we too could be pushed to extremes. That is a scary thought, to imagine the dark potentials within our own human nature.

Another recent example is how the supporters of Sanders and Trump have been portrayed in the mainstream, not only by the media but also by the public. It is easier to dismiss people who point out the problems of our society and demand changes. It is easier to shoot the messenger than to listen to the message. To listen, though, would require an act of imagination by stepping outside of the dominant paradigm. Maybe these people who are speaking out in frustration aren’t mere dogmatic ideologues, political purists, impractical idealists, angry malcontents, and cultish followers. Just for a moment imagine they are normal humans who are dealing with difficult situations.

The data supports this. To focus on Sanders’ camp, they aren’t middle class activists complaining that the world doesn’t conform to radical ideologies. Quite the opposite. They are the lowest income group of any supporters among present candidates. And they are an ideologically diverse group that is motivated by a simple desire for reform and they feel inspired by the only major candidate imagining a positive compelling vision of the future.

Even though I’m critical of Donald Trump, I’ve come around to not wanting to criticize his supporters. There are those who falsely portray them as mere ignorant poor whites—actually, they aren’t particularly poor or uneducated compared to the general population. I don’t want to dismiss these people as bigots and proto-fascists. I’m tired of that kind of negative use of imagination. It ends up being a way of avoiding the very problems that frustrate these people, which inevitably makes the problems worse, leading to ever greater frustration. It is the mainstream dismissal of such people that, if anything, will set the stage for fascism.

I see this in general with demographics, public opinion, and various labels. Most Americans are far to the left of the political and media elite, but of course the political and media elite present themselves as the social norm that defines any deviation as radical or dangerous. Many average Americans get taken in by this narrative that constrains the public imagination.

Even many people far outside the mainstream get taken in by this. They’ll conflate the so-called liberal class with all liberalism, despite the silenced majority being to the left of this liberal class and much more diverse as well. I hear left-wingers and right-wingers alike, often self-portrayed grim realists, make vague and overgeneralized accusations against ‘liberals’. Much petty nitpicking is involved. I don’t get the sense that many of them even know what liberalism is. It’s just a bogeyman to them, a useful scapegoat for the failures of their own ideological thinking. There is no great courage, no profound wisdom in merely attacking others and tearing them down.

This isn’t a failure of a single group. It’s not about a conflict between left vs right. Nor is it even exactly mainstream vs alternative. It’s a collective failure of imagination. We are all complicit.

I’m not sure what the helpful response to all of this is. I struggle with these challenges like anyone else. More times than I’d like to admit, I’ve fallen into these traps of failed imagination. Yet I never give up on imagination. To envision something new is powerful beyond measure. I have this sense that the most destructive claim that can ever be made in a society aspiring to freedom is that something is ‘unrealistic’ and ‘impractical’. That is the death knell of imagination. This is not a time to sell ourselves short.

If we were to let loose the reins of our minds, where might new possibilities lead us? Yes, have the vision to see clearly what’s wrong, but then take that same vision to see our common fate and our shared potential. It’s not only that we should dare to imagine. We must not forget that imagination is as necessary as the air we breathe. We are always imagining. The trick is to learn to dream while awake.

An Invisible Debt Made Visible

Externalized costs have been on my mind for a very long time. Ours is a self-enclosed biosphere. All costs are ultimately internal, no matter how much we pretend otherwise.

My sense of the political has been rooted in environmentalism, from early on in my life. This worldview has been informed by a larger environmental sense of the world, including the social and economic environment. It’s always been how I experience reality, as something far beyond the false divisions we create and reify—between individual and collective, self and world, society and nature.

My young political sensibility was expressed in school papers I wrote about externalized costs, a gut level intuition about what was being lost. These papers were about overpopulation and pollution. Now here I am as an adult and everything has gotten far worse, some might say beyond the point of no return.

Pollution and environmental destruction know no boundaries. The natural world cares not about our ideological beliefs. It doesn’t matter whose fault it is when the costs come due. The free market is and always was bullshit. Nothing is free, even if we don’t see the price tag. In fact, capitalism is rather costly. The ultimate cost might be greater than we can afford.

These costs are highly personal. I’ve talked many times about lead toxicity, the costs of which are numerous and yet still measurable. For every IQ point lost to lead toxicity, a specific amount of money is lost in lifetime earnings. Multiply that by many IQ points lost for untold millions of people. The costs are devastating, and that is considering just one of many costs.

Taken together, pollution and environmental degradation are the cause of 40% of deaths worldwide. Those deaths include working men and women who were helping care for family members. Those deaths represent human potential thrown away. Those deaths didn’t just happen instantly but followed years or even decades of illnesses, suffering, and healthcare costs.

Other costs are economic on a larger scale, and these too can be measured. For a long time, I’ve suspected that many corporations would go bankrupt if they were ever forced to pay for their externalized costs. This was shown to be the case with a recent UN report:

“The report found that when you took the externalized costs into effect, essentially NONE of the industries was actually making a profit. The huge profit margins being made by the world’s most profitable industries (oil, meat, tobacco, mining, electronics) is being paid for against the future: we are trading long term sustainability for the benefit of shareholders. Sometimes the environmental costs vastly outweighed revenue, meaning that these industries would be constantly losing money had they actually been paying for the ecological damage and strain they were causing.”

This means these industries are environmentally a net loss to the global society. They aren’t contributing more to society than they are taking away. All the rhetoric of capitalism, meritocracy, and progress is lies built upon lies.

We obsess about individual problems when that isn’t the real danger we face. We make people feel guilty about recycling at home while corporations throw out so many potential recyclables as to make all the rest look minuscule. Similarly, almost all the pollution comes from big biz, not from people driving their cars too much or whatever. If we wanted to make a dent in these problems, we’d tackle them at the level of their most major contributors, instead of tinkering around the edges.

Meanwhile, these companies that profit from human misery, from the forced sacrifice of present and future generations, lobby the world’s governments so that they’ll make even greater profit. They get tax breaks and subsidies. They hide their profits in fake businesses and secret overseas accounts. We debate about whether taxes are too high when any rational and moral person is forced to admit that taxes don’t come close to offsetting all the costs these filthy rich corporations force onto the rest of society.

Why do we tolerate this? Are we mentally deranged? Are we suicidal?

If the unsustainable costs of industrial externalities don’t incite mass outrage and force systemic global reform, then there is no hope left for humanity. We are doomed. Saving capitalism from communism will be the least of anyone’s worries.

Is anyone paying attention? It’s only the survival of civilization as we know it. No biggie. Have we grown so cynical and fearful that we can’t even face reality barreling down on us like a freight train? We are looking at a nightmare scenario.

Costs can be externalized and deferred. But costs can’t be denied.
Even if we are lucky enough to die before costs become due, do we really want to be such sociopathic assholes in the legacy we leave for the coming generations, for our children and grandchildren? They will curse us for what we did and failed to do.

We will be among the most hated generations ever born. There will be no forgiveness for us. Memorials will be built in memory of the evil we committed and the destruction we caused.

Of course, we could in this moment begin to lessen some of this harm. We could prepare for the consequences we’ve unleashed. We could give these next generations a fighting chance. Will we?