Erosion of the Bronze Age

I’ve previously argued that the development of large-scale agriculture in the late Bronze Age may have helped cause a psychological transformation that preceded the societal collapse. The late Bronze Age empires became too large to be sustainable, at least under the social order that had developed (i.e., Julian Jaynes’ theory of the bicameral mind).

Prior to this, the Bronze Age had been dominated by smaller city-states that were spread further apart. They had some agriculture but still relied heavily on hunting, fishing, trapping, and gathering. It would have been a low-carb, high-fat diet. But as time went on, growing populations became ever more dependent upon agriculture. This meant a shift toward an increasingly high-carb diet that was much less nutrient-dense, along with a greater prevalence of addictive substances.

Agriculture may have had other impacts as well. The shift to a more fully agricultural diet meant the need for vaster areas to farm. The only way to accomplish that was deforestation. Along with the destabilizing psychological changes, there also would have been the destabilizing forces of erosion. Then a perfect storm of environmental stressors hit in a short period of time: volcanoes, earthquakes, tidal waves, flooding, etc. With waves of refugees and marauders, the already weakened empires fell like dominoes.

Erosion probably had been making the farmland less fertile for centuries. This wasn’t too much of a problem until overpopulation reached a breaking point. Small yields for multiple years in a row no doubt left grain reserves depleted. Much starvation would have followed. And the already sickly agricultural populations would have fallen prey to plagues.

This boom and bust cycle of agricultural civilizations would repeat throughout history. And it often would coincide with major changes in psychology and social order. Our own civilization appears to be coming near the end of a boom period. Erosion is now happening faster and at a larger scale than seen with any prior civilization. But like the archaic bicameral societies, we are trapped by our collective mentality and can’t imagine how to change.

* * *

Trees, the ancient Macedonians, and the world’s first environmental disaster
by Anthony Dosseto and Alex Francke

Recently, we have studied sediments from Lake Dojran, straddling the border between North Macedonia and Greece. We looked at the past 12,000 years of sediment archive and found that about 3,500 years ago a massive erosion event happened.

Pollen trapped in the lake’s sediment suggests this is linked to deforestation and the introduction of agriculture in the region. Macedonian timber was highly praised for ship building at the time, which could explain the extent of deforestation.

A massive erosion event would have catastrophic consequences for agriculture and pasture. Interestingly, this event is followed by the onset of the so-called Greek “Dark Ages” (3,100 to 2,850 years ago) and the demise of the highly sophisticated Bronze Age Mycenaean civilisation.

Technological Fears and Media Panics

“One of the first characteristics of the first era of any new form of communication is that those who live through it usually have no idea what they’re in.”
~Mitchell Stephens

“Almost every new medium of communication or expression that has appeared since the dawn of history has been accompanied by doomsayers and critics who have confidently predicted that it would bring about The End of the World as We Know It by weakening the brain or polluting our precious bodily fluids.”
~New Media Are Evil, from TV Tropes

“The internet may appear new and fun…but it’s really a porn highway to hell. If your children want to get on the internet, don’t let them. It’s only a matter of time before they get sucked into a vortex of shame, drugs, and pornography from which they’ll never recover. The internet…it’s just not worth it.”
~Grand Theft Auto: Liberty City Stories

“It’s the same old devil with a new face.”
~Rev. George Bender, Harry Potter book burner

Media technology is hard to ignore. This goes beyond it being pervasive. Our complaints and fears, our fascination and optimism are mired in far greater things. It is always about something else. Media technology is not only the face of some vague cultural change but the embodiment of new forms of power that seem uncontrollable. Our lives are no longer fully our own, a constant worry in an individualistic society. With globalization, it’s as if the entire planet has become a giant company town.

I’m not one for giving in to doom and gloom about technology. That response is as old as civilization and doesn’t offer anything useful. But I’m one of the first to admit to the dire situation we are facing. It’s just that in some sense the situation has always been dire, the world has always been ending. We never know if this finally will be the apocalypse that has been predicted for millennia, an ending to end it all with no new beginning. One way or another, the world as we know it is ending. There probably isn’t much reason to worry about it. Whatever the future holds, it is beyond our imagining, as our present world was beyond the imagining of past generations.

One thing is clear. There is no point in getting into a moral panic over it. The young who embrace what is new always get blamed for it, even though they are simply inheriting what others have created. The youth today aren’t any worse off than any prior generation at the same age. Still, it’s possible that these younger generations might take us into a future that we old fogies won’t be able to understand. History shows how shocking innovations can be. Speaking of panics, think about Orson Welles’s radio show, War of the Worlds. The voice of radio back then had a power that we no longer can appreciate. Yet here we are, with radio now just so much background noise amid all the rest.

Part of what got me thinking about this were two posts by Matt Cardin, at The Teeming Brain blog. In one post, he shares some of Nathaniel Rich’s review, Roth Agonistes, of Philip Roth’s Why Write?: Collected Nonfiction 1960–2013. There is a quote from Roth in 1960:

“The American writer in the middle of the twentieth century has his hands full in trying to understand, describe, and then make credible much of American reality. It stupefies, it sickens, it infuriates, and finally it is even a kind of embarrassment to one’s own meager imagination. The actuality is continually outdoing our talents, and the culture tosses up figures almost daily that are the envy of any novelist.”

Rich comments that, “Roth, despite writing before the tumult of the Sixties, went farther, suggesting that a radically destabilized society had made it difficult to discriminate between reality and fiction. What was the point of writing or reading novels when reality was as fantastic as any fiction? Such apprehensions may seem quaint when viewed from the comic-book hellscape of 2018, though it is perversely reassuring that life in 1960 felt as berserk as it does now.”

We are no more post-truth now than back then. It’s always been this way. But it is easy to lose context. Rich notes that, “Toward the end of his career, in his novels and public statements, Roth began to prophesy the extinction of a literary culture — an age-old pastime for aging writers.” There is an ever-present fear that the strangeness and stresses of the unknown will replace the comfort of the familiar. We all grow attached to the world we experienced in childhood, as it forms the foundation of our identity. But every now and then something comes along to threaten it all. And the post-World War era was definitely a time of dramatic and, for some, traumatic change — despite all of the nostalgia that has accrued to its memories like flowers on a gravestone.

The technological world we presently live in took its first form during that earlier era. Since then, the book as an art form has come nowhere near extinction. More books have been printed in recent decades than ever before in history. New technology has oddly led us to read even more books, both in their old and new technological forms. My young niece, of the so-called Internet Generation, prefers physical books… not that she is likely to read Philip Roth. Literacy, along with education and IQ, is on the rise. There is more of everything right now, which is what makes it overwhelming. Technologies of the past for the most part aren’t being replaced but incorporated into a different world. This Borg-like process of assimilation might be more disturbing to the older generations than simply becoming obsolete.

The other post by Matt Cardin shares an excerpt from an NPR piece by Laura Sydell, The Father Of The Internet Sees His Invention Reflected Back Through A ‘Black Mirror’. It is about the optimism of inventors and the consequences of inventions, unforeseen except by a few. One of those who did see the long-term implications was William Gibson: “The first people to embrace a technology are the first to lose the ability to see it objectively.” Maybe so, but that is true of just about everyone, including most of those who don’t embrace it or who go so far as to fear it. It’s not in human nature to see much of anything objectively.

Gibson did see the immediate realities of what he coined as ‘Cyberspace’. We do seem to be moving in that general direction of cyberpunk dystopia, at least here in this country. I’m less certain about the even longer term developments, as Gibson’s larger vision is as fantastical as many others. But it is the immediate realities that always concern people because they can be seen and felt, if not always acknowledged for what they are, often not even by the fear-mongers.

I share his being more “interested in how people behave around new technologies.” In reference to “how TV changed New York City neighborhoods in the 1940s,” Gibson states that, “Fewer people sat out on the stoops at night and talked to their neighbors, and it was because everyone was inside watching television. No one really noticed it at the time as a kind of epochal event, which I think it was.”

I would make two points about this.

First, there is what I already said. It is always an epochal event when a major technology is invented, going back to the many inventions before it, in media technology (radio, film, telegraph, printing press, bound book, etc.) but also in other technologies (assembly lines, the cotton gin, the compass, etc.). Did the Chinese individual who assembled the first firework imagine the carnage of bombs that made castles easy targets and led to two world wars that transformed all of human existence? Of course not. Even the simplest of technologies can turn civilization on its head, which has happened multiple times over the past few millennia and often with destructive results.

The second point is to look at something specific like television. It came along with the building of the interstate highway system, the rise of car culture, and the spread of suburbia. Television became a portal for the outside world to invade the fantasyland of home life that took hold after the war. Similar fears about radio and the telephone were transferred to the television set, and those fears were directed at the young. The first half of the 20th century was a time of constant technological wonder and uncertainty. The social order was thrown askew.

We like to imagine the 1940s and 1950s as a happy time of social conformity and order, a time of prosperity and a well-behaved population, but that fantasy didn’t match the reality. It was an era of growing concerns about adolescent delinquency, violent crime, youth gangs, sexual deviancy, teen pregnancy, loose morals, and rock ‘n roll — and the data bears out that a large number in that generation were caught up in the criminal system, whether because they were genuinely a bad generation or because the criminal system had become more punitive, although others have argued that it was merely a side effect of the baby boom, with youth making up a greater proportion of society. Whatever was involved, the sense of social angst got mixed up with lingering wartime trauma and emerging Cold War paranoia. The policing, arrests, and detention of wayward youth became a priority to the point of oppressive obsession. Besides youth problems, veterans from World War II did not come home content and happy (listen to Audible’s “The Home Front”). It was a tumultuous time, quite the opposite of the perfect world portrayed in the family sitcoms of the 1940s and 1950s.

The youth during that era had a lot in common with their grandparents, the wild and unruly Lost Generation corrupted by the family and community breakdown that came with early mass immigration, urbanization, industrialization, consumerism, etc. Starting in the late 1800s, youth gangs and hooliganism became rampant, and moral panic became widespread. As romance novels had been blamed earlier and comic books would be blamed later, the popular media most feared around the turn of the century were the violent penny dreadfuls and dime novels that targeted tender young minds with portrayals of lawlessness and debauchery, or so it seemed to the moral reformers and authority figures.

It was the same old fear rearing its ugly head. This pattern has repeated on a regular basis. What new technology does is give an extra push to the swings of generational cycles. So, as change occurs, much remains the same. For all that William Gibson got right, no one can argue that the world has been balkanized into anarcho-corporatist city-states (as in Neal Stephenson’s Snow Crash), although it sure is a plausible near future. The general point is true, though. We are a changed society. Yet the same old patterns of fear-mongering and moral panic continue. What is cyclical and what is a trend is hard to differentiate as it happens; it is easier to see clearly in hindsight.

I might add that vast technological and social transformations have occurred every century for the past half millennium. The ending of feudalism was far more devastating. Much earlier, the technological advancement of written text and the end of oral culture had greater consequences than even Socrates could have predicted. And it can’t be forgotten that movable type printing presses ushered in centuries of mass civil unrest, populist movements, religious wars, and revolution across numerous countries.

Our own time so far doesn’t compare, one could argue. The present relative peace and stability will continue until, maybe, World War III or climate catastrophe forces a technological realignment and restructuring of civilization. Anyway, the internet corrupting the youth and smartphones rotting away people’s brains should be the least of our worries.

Even the social media meddling that Russia is accused of using to manipulate the American population is simply a continuation of techniques that predate the internet. The game has changed a bit, but nations and corporations are pretty much acting in the devious ways they always have, except they are collecting a lot more info. Admittedly, technology does increase the effectiveness of their deviousness. But it also increases the potential methods for resisting and revolting against oppression.

I do see major changes coming. My doubts are more about how that change will happen. Modern civilization is massively dysfunctional. That we use new technologies less than optimally might have more to do with pre-existing conditions of general crappiness. For example, television along with air conditioning likely did contribute to people not sitting outside and talking to their neighbors, but an equal or greater contribution probably came from the diverse social and economic forces driving shifts in urbanization and suburbanization, with the dying of small towns and the exodus from ethnic enclaves. Though technology was mixed into these changes, we maybe give technology too much credit and blame for changes that were already in motion.

It is similar to the shift away from a biological explanation of addiction. It’s less that certain substances create uncontrollable cravings. Such destructive behavior is only possible and probable when particular conditions are set in place. There already has to be a breakdown of relationships of trust and support. But rebuild those relationships and the addictive tendencies will lessen.

Similarly, there is nothing inevitable about William Gibson’s vision of the future or rather his predictions might be more based on patterns in our society than anything inherent to the technology itself. We retain the choice and responsibility to create the world we want or, failing that, to fall into self-fulfilling prophecies.

The question is what is the likelihood of our acting with conscious intention and wise forethought. All in all, self-fulfilling prophecy appears to be the most probable outcome. It is easy to be cynical, considering the track record of the present superpower that dominates the world and the present big biz corporatism that dominates the economy. Still, I hold out for the chance that conditions could shift for various reasons, altering what otherwise could be taken as near inevitable.

* * *

6/13/21 – Here is an additional thought that could be made into a new separate post, but for now we’ll leave it here as a note. There is evidence that new media technology does have an effect on thought, perception, and behavior. This is measurable in brain scans. But other research shows it even alters personality, or rather suppresses its expression. In a scientific article about testing for the Big Five personality traits, Tim Blumer and Nicola Döring offer an intriguing conclusion:

“To sum up, we conclude that for four of the five factors the data indicates a decrease of personality expression online, which is most probably due to the specification of the situational context. With regard to the trait of neuroticism, however, an additional effect occurs: The emotional stability increases on the computer and the Internet. This trend is likely, as has been described in previous studies, due to the typical features of computer-mediated communication (see Rice & Markey, 2009)” (Are we the same online? The expression of the five factor personality traits on the computer and the Internet).

This makes one wonder what it actually means. These personality tests are self-reports and so carry that bias. Still, that is useful information in indicating what people are experiencing and perceiving about themselves. It also gives some evidence of what people are expressing, even when they aren’t conscious of it, as these tests are designed with carefully phrased questions, including decoy questions.

It is quite likely that the personality is genuinely being suppressed when people engage with the internet. Online experience eliminates so many normal behavioral and biological cues (tone of voice, facial expressions, eye gaze, hand gestures, bodily posture, rate of breathing, pheromones, etc). It would be unsurprising if this induces at least mild psychosis in many people, in that individuals literally become disconnected from most aspects of normal reality and human relating. If nothing else, it would surely increase anxiety and agitation.

When online, we really aren’t ourselves. That is because we are cut off from the social mirroring that allows self-awareness. There is a theory that theory of mind and hence cognitive empathy is developed in childhood first through observing others. It’s in becoming aware that others have minds behind their behavior that we then develop a sense of our own mind as separate from others and the world.

The new media technologies remove the ability to easily sense others as actual people. This creates a strange familiarity and intimacy as, in a way, all of the internet is inside of you. What is induced can at times be akin to psychological solipsism. We’ve noticed how often people don’t seem to recognize the humanity of others online in the way they would if a living-and-breathing person were right in front of them. Most of the internet is simply words. And even pictures and videos are isolated from all real-world context and physical immediacy.

Yet it’s not clear we can make a blanket accusation about the risks of new media. Not all aspects are the same. In fact, one study indicated “a positive association between general Internet use, general use of social platforms and Facebook use, on the one hand, and self-esteem, extraversion, narcissism, life satisfaction, social support and resilience, on the other hand.” It’s not merely being on the internet that is the issue but specific platforms of interaction and how they shape human experience and behavior.

The effects varied greatly, as the researchers found: “Use of computer games was found to be negatively related to these personality and mental health variables. The use of platforms that focus more on written interaction (Twitter, Tumblr) was assumed to be negatively associated with positive mental health variables and significantly positively with depression, anxiety, and stress symptoms. In contrast, Instagram use, which focuses more on photo-sharing, correlated positively with positive mental health variables” (Julia Brailovskaia et al, What does media use reveal about personality and mental health? An exploratory investigation among German students).

That is good evidence, but the video game result is perplexing. Gaming is image-based, as is Instagram. Why does the former lead to less optimal outcomes and the latter not? It might have to do with Instagram being more socially-oriented, whereas video games can be played in isolation. Would that still be true of video games played with friends and/or in multi-user online worlds? Anyway, it is unsurprising that text-based social media is clearly a net loss for mental health. That would definitely fit with the theory that it’s particularly something about the disconnecting effect of words alone on a screen.

This fits our own experience even when interacting with people we’ve personally known for most or all of our lives. It’s common to write something to a family member or an old friend by email, text message, FB messenger, etc. and, knowing they received it and read it, get no response, not even a brief acknowledgement. No one in the “real world” would act that way to “real people”. No one who wanted to maintain a relationship would stand near you while you spoke to them face to face, not look at you or say anything, and then walk away like you weren’t there. But that is the point of the power of textual derealization.

Part of this is the newness of new media. We simply have not yet fully adapted to it. People have freaked out about every media innovation that came along. And no doubt the fears and anxiety were often based on genuine concerns and direct observations. When media changes, it does have a profound effect on people and society. People probably do act deranged for a period of time. It might take generations or centuries for society to settle down after each period of change. That is the problem with the modern world, where we’re hit by such a vast number of innovations in such quick succession. The moment we regain our senses enough to try to stand back up again, we are hit by the next onslaught.

We are in the middle of what one could call New Media Derangement Syndrome (NMDS). It’s not merely one thing but a thousand things, combined with a total technological overhaul of society. This past century has turned all of civilization on its head. Over a few generations, most of it occurring within a single lifespan, humanity went from mostly rural communities, a farm-based economy, horse-and-buggies, books, and newspapers to mass urbanization, skyscrapers, factories, trains, cars, trucks, ocean liners, airplanes and jets, rocket ships, electricity, light bulbs, telegraphs, movies, radio, television, telephones, smartphones, the internet, radar, x-rays, air conditioning, etc.

The newest of new media is bringing in a whole other aspect. We are now living in not just a banana republic but inverted totalitarianism. Unlike in the past, the everyday experience of our lives is more defined by corporations than by churches, communities, or governments. Think of how most people spend most of their waking hours regularly checking into various corporate media technologies, platforms, and networks, including while at work, with a smartphone next to the bed giving instant notifications.

Think about how almost all media that Americans now consume is owned and controlled by a handful of transnational corporations. Yet not that long ago, most media was owned and operated locally by small companies, non-profit organizations, churches, etc. Most towns had multiple independently-run newspapers. Likewise, most radio and TV shows were locally or regionally produced in the mid-20th century. The moveable type printing press made possible the first era of mass media, but that was small-time change compared to the nationalization and globalization of mass media over the past century.

Part of NMDS is that we consumer-citizens have become commodified products. Social media and the like are not products that we are buying. No, we and our data are the product being sold. The crazification factor is how everything has become manipulated by data-gathering and algorithms. Corporations now have larger files on American citizens than the FBI did during the height of the Cold War. New media technology is one front of the corporate war on democracy and the public good. Economics is now the dominant paradigm of everything. In her article Social Media Is a Borderline Personality Disorder, Kasia Chojecka wrote:

“Social media, as Tristan Harris said, undermined human weaknesses and contributed to what can be called a collective depression. I would say it’s more than that — a borderline personality disorder with an emotional rollercoaster, lack of trust, and instability. We can’t stay sane in this world right now, Harris said. Today the world is dominated not only by surveillance capitalism based on commodification and commercialization of personal data (Sh. Zuboff), but also by a pandemic, which caused us to shut ourselves at home and enter a full lockdown mode. We were forced to move to social media and are now doomed to instant messaging, notifications, the urge to participate. The scale of exposure to social media has grown incomparably (the percentages of growth vary — some estimate it would be circa ten percent or even several dozen percent in comparison to 2019).”

The result of this media-induced mass insanity can be seen in conspiracy theories (QAnon, Pizzagate, etc.) and related mass psychosis and mass hallucinations: Jewish lasers in space starting wildfires, global child prostitution rings that operate on moon bases, vast secret tunnel systems that connect empty Walmarts, and on and on. That paranoia, emerging from the dark corners of the web, helped launch an insurrection against the government and caused attempted kidnappings and assassinations of politicians. Plus, it got a bizarre media personality cult elected as president. If anyone doubted the existence of NMDS in the past, it has since become the undeniable reality we all now live in.

* * *

Fear of the new - a techno panic timeline

11 Examples of Fear and Suspicion of New Technology
by Len Wilson

New communications technologies don’t come with user’s manuals. They are primitive, while old tech is refined. So critics attack. The critic’s job is easier than the practitioner’s: they score with the fearful by comparing the infancy of the new medium with the perfected medium it threatens. But of course, the practitioner wins. In the end, we always assimilate to the new technology.

“Writing is a step backward for truth.”
~Plato, c. 370 BC

“Printed book will never be the equivalent of handwritten codices.”
~Trithemius of Sponheim, 1492

“The horrible mass of books that keeps growing might lead to a fall back into barbarism.”
~Gottfried Wilhelm Leibniz, 1680

“Few students will study Homer or Virgil when they can read Tom Jones or a thousand inferior or more dangerous novels.”
~Rev. Vicesimus Knox, 1778

“The most powerful of ignorance’s weapons is the dissemination of printed matter.”
~Count Leo Tolstoy, 1869

“We will soon be nothing but transparent heaps of jelly to each other.”
~New York Times 1877 Editorial, on the advent of the telephone

“[The telegraph is] a constant diffusion of statements in snippets.”
~Spectator Magazine, 1889

“Have I done the world good, or have I added a menace?”
~Guglielmo Marconi, inventor of radio, 1920

“The cinema is little more than a fad. It’s canned drama. What audiences really want to see is flesh and blood on the stage.”
~Charlie Chaplin, 1916

“There is a world market for about five computers.”
~Thomas J. Watson, IBM Chairman and CEO, 1943

“Television won’t be able to hold on to any market it captures after the first six months. People will soon get tired of staring at a plywood box every night.”
~Darryl Zanuck, 20th Century Fox CEO, 1946

Media Hysteria: An Epidemic of Panic
by Jon Katz

MEDIA HYSTERIA OCCURS when tectonic plates shift and the culture changes – whether from social changes or new technology.

It manifests itself when seemingly new fears, illnesses, or anxieties – recovered memory, chronic fatigue syndrome, alien abduction, seduction by Internet molesters, electronic theft – are described as epidemic disorders in need of urgent recognition, redress, and attention.

For those of us who live, work, message, or play in new media, this is not an abstract offshoot of the information revolution, but a topic of some urgency: We are the carriers of these contagious ideas. We bear some of the responsibility and suffer many of the consequences.

Media hysteria is part of what causes the growing unease many of us feel about the toxic interaction between technology and information.

Moral Panics Over Youth Culture and Video Games
by Kenneth A. Gagne

Several decades of the past century have been marked by forms of entertainment that were not available to the previous generation. The comic books of the Forties and Fifties, rock ‘n roll music of the Fifties, Dungeons & Dragons in the Seventies and Eighties, and video games of the Eighties and Nineties were each part of the popular culture of that era’s young people. Each of these entertainment forms, which is each a medium unto itself, have also fallen under public scrutiny, as witnessed in journalistic media such as newspapers and journals – thus creating a “moral panic.”

The Smartphone’s Impact is Nothing New
by Rabbi Jack Abramowitz

Any invention that we see as a benefit to society was once an upstart disruption to the status quo. Television was terrible because when we listened to the radio, we used our imaginations instead of being spoon-fed. Radio was terrible because families used to sit around telling stories. Moveable type was terrible because if books become available to the masses, the lower classes will become educated beyond their level. Here’s a newsflash: Socrates objected to writing! In The Phaedrus (by his disciple Plato), Socrates argues that “this discovery…will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. … (Y)ou give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

When the Internet and the smartphone evolved, society did what we always do: we adapted. Every new technology has this effect. Do you know why songs on the radio are about 3½ minutes long? Because that’s what a 45-rpm record would hold. Despite the threat some perceived in this radical format, we adapted. (As it turns out, 45s are now a thing of the past but the pop song endures. Turns out we like 3½-minute songs!)

Is the Internet Making Us Crazy? What the New Research Says
by Tony Dokoupil

The first good, peer-reviewed research is emerging, and the picture is much gloomier than the trumpet blasts of Web utopians have allowed. The current incarnation of the Internet—portable, social, accelerated, and all-pervasive—may be making us not just dumber or lonelier but more depressed and anxious, prone to obsessive-compulsive and attention-deficit disorders, even outright psychotic. Our digitized minds can scan like those of drug addicts, and normal people are breaking down in sad and seemingly new ways. […]

And don’t kid yourself: the gap between an “Internet addict” and John Q. Public is thin to nonexistent. One of the early flags for addiction was spending more than 38 hours a week online. By that definition, we are all addicts now, many of us by Wednesday afternoon, Tuesday if it’s a busy week. Current tests for Internet addiction are qualitative, casting an uncomfortably wide net, including people who admit that yes, they are restless, secretive, or preoccupied with the Web and that they have repeatedly made unsuccessful efforts to cut back. But if this is unhealthy, it’s clear many Americans don’t want to be well. […]

The Gold brothers—Joel, a psychiatrist at New York University, and Ian, a philosopher and psychiatrist at McGill University—are investigating technology’s potential to sever people’s ties with reality, fueling hallucinations, delusions, and genuine psychosis, much as it seemed to do in the case of Jason Russell, the filmmaker behind “Kony 2012.” The idea is that online life is akin to life in the biggest city, stitched and sutured together by cables and modems, but no less mentally real—and taxing—than New York or Hong Kong. “The data clearly support the view that someone who lives in a big city is at higher risk of psychosis than someone in a small town,” Ian Gold writes via email. “If the Internet is a kind of imaginary city,” he continues. “It might have some of the same psychological impact.”

What parallels do you see between the invention of the internet – the ‘semantic web’ and the invention of the printing press?
answer by Howard Doughty

Technology, and especially the technology of communication, has tremendous consequences for human relations – social, economic and political.

Socrates raged against the written word, insisting that it was the end of philosophy which, in his view, required two or more people in direct conversation. Anything else, such as a text, was at least one step removed from the real thing and, like music and poetry which he also despised, represented a pale imitation (or bastardization) of authentic life. (Thank goodness Plato wrote it all down.)

From an oral to a written society was one thing, but as Marshall McLuhan so eruditely explained in his book, The Gutenberg Galaxy, the printing press altered fundamental cultural patterns again – making reading matter more easily available and, in the process, enabling the Protestant Reformation and its emphasis on isolated individual interpretations of whatever people imagined their god to be.

In time, the telegraph and the telephone began the destruction of space, time and letter writing, making it possible to have disembodied conversations over thousands of miles.

Don’t Touch That Dial!
by Vaughan Bell

A respected Swiss scientist, Conrad Gessner, might have been the first to raise the alarm about the effects of information overload. In a landmark book, he described how the modern world overwhelmed people with data and that this overabundance was both “confusing and harmful” to the mind. The media now echo his concerns with reports on the unprecedented risks of living in an “always on” digital environment. It’s worth noting that Gessner, for his part, never once used e-mail and was completely ignorant about computers. That’s not because he was a technophobe but because he died in 1565. His warnings referred to the seemingly unmanageable flood of information unleashed by the printing press.

Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain. From a historical perspective, what strikes home is not the evolution of these social concerns, but their similarity from one century to the next, to the point where they arrive anew with little having changed except the label.

 These concerns stretch back to the birth of literacy itself. In parallel with modern concerns about children’s overuse of technology, Socrates famously warned against writing because it would “create forgetfulness in the learners’ souls, because they will not use their memories.” He also advised that children can’t distinguish fantasy from reality, so parents should only allow them to hear wholesome allegories and not “improper” tales, lest their development go astray. The Socratic warning has been repeated many times since: The older generation warns against a new technology and bemoans that society is abandoning the “wholesome” media it grew up with, seemingly unaware that this same technology was considered to be harmful when first introduced.

Gessner’s anxieties over psychological strain arose when he set about the task of compiling an index of every available book in the 16th century, eventually published as the Bibliotheca universalis. Similar concerns arose in the 18th century, when newspapers became more common. The French statesman Malesherbes railed against the fashion for getting news from the printed page, arguing that it socially isolated readers and detracted from the spiritually uplifting group practice of getting news from the pulpit. A hundred years later, as literacy became essential and schools were widely introduced, the curmudgeons turned against education for being unnatural and a risk to mental health. An 1883 article in the weekly medical journal the Sanitarian argued that schools “exhaust the children’s brains and nervous systems with complex and multiple studies, and ruin their bodies by protracted imprisonment.” Meanwhile, excessive study was considered a leading cause of madness by the medical community.

When radio arrived, we discovered yet another scourge of the young: The wireless was accused of distracting children from reading and diminishing performance in school, both of which were now considered to be appropriate and wholesome. In 1936, the music magazine the Gramophone reported that children had “developed the habit of dividing attention between the humdrum preparation of their school assignments and the compelling excitement of the loudspeaker” and described how the radio programs were disturbing the balance of their excitable minds. The television caused widespread concern as well: Media historian Ellen Wartella has noted how “opponents voiced concerns about how television might hurt radio, conversation, reading, and the patterns of family living and result in the further vulgarization of American culture.”

Demonized Smartphones Are Just Our Latest Technological Scapegoat
by Zachary Karabell

AS IF THERE wasn’t enough angst in the world, what with the Washington soap opera, #MeToo, false nuclear alerts, and a general sense of apprehension, now we also have a growing sense of alarm about how smartphones and their applications are impacting children.

In the past days alone, The Wall Street Journal ran a long story about the “parents’ dilemma” of when to give kids a smartphone, citing tales of addiction, attention deficit disorder, social isolation, and general malaise. Said one parent, “It feels a little like trying to teach your kid how to use cocaine, but in a balanced way.” The New York Times ran a lead article in its business section titled “It’s Time for Apple to Build a Less Addictive iPhone,” echoing a rising chorus in Silicon Valley about designing products and programs that are purposely less addictive.

All of which begs the question: Are these new technologies, which are still in their infancy, harming a rising generation and eroding some basic human fabric? Is today’s concern about smartphones any different than other generations’ anxieties about new technology? Do we know enough to make any conclusions?

Alarm at the corrosive effects of new technologies is not new. Rather, it is deeply rooted in our history. In ancient Greece, Socrates cautioned that writing would undermine the ability of children and then adults to commit things to memory. The advent of the printing press in the 15th century led Church authorities to caution that the written word might undermine the Church’s ability to lead (which it did) and that rigor and knowledge would vanish once manuscripts no longer needed to be copied manually.

Now, consider this question: “Does the telephone make men more active or more lazy? Does [it] break up home life and the old practice of visiting friends?” Topical, right? In fact, it’s from a 1926 survey by the Knights of Columbus about old-fashioned landlines.

 The pattern of technophobia recurred with the gramophone, the telegraph, the radio, and television. The trope that the printing press would lead to loss of memory is very much the same as the belief that the internet is destroying our ability to remember. The 1950s saw reports about children glued to screens, becoming more “aggressive and irritable as a result of over-stimulating experiences, which leads to sleepless nights and tired days.” Those screens, of course, were televisions.

Then came fears that rock-n-roll in the 1950s and 1960s would fray the bonds of family and undermine the ability of young boys and girls to become productive members of society. And warnings in the 2000s that videogames such as Grand Theft Auto would, in the words of then-Senator Hillary Rodham Clinton, “steal the innocence of our children, … making the difficult job of being a parent even harder.”

Just because these themes have played out benignly time and again does not, of course, mean that all will turn out fine this time. Information technologies from the printed book onward have transformed societies and upended pre-existing mores and social order.

Protruding Breasts! Acidic Pulp! #*@&!$% Senators! McCarthyism! Commies! Crime! And Punishment!
by R.C. Baker

In his medical practice, Wertham saw some hard cases—juvenile muggers, murderers, rapists. In Seduction, he begins with a gardening metaphor for the relationship between children and society: “If a plant fails to grow properly because attacked by a pest, only a poor gardener would look for the cause in that plant alone.” He then observes, “To send a child to a reformatory is a serious step. But many children’s-court judges do it with a light heart and a heavy calendar.” Wertham advocated a holistic approach to juvenile delinquency, but then attacked comic books as its major cause. “All comics with their words and expletives in balloons are bad for reading.” “What is the social meaning of these supermen, super women … super-ducks, super-mice, super-magicians, super-safecrackers? How did Nietzsche get into the nursery?” And although the superhero, Western, and romance comics were easily distinguishable from the crime and horror genres that emerged in the late 1940s, Wertham viewed all comics as police blotters. “[Children] know a crime comic when they see one, whatever the disguise”; Wonder Woman is a “crime comic which we have found to be one of the most harmful”; “Western comics are mostly just crime comic books in a Western setting”; “children have received a false concept of ‘love’ … they lump together ‘love, murder, and robbery.’” Some crimes are said to directly imitate scenes from comics. Many are guilty by association—millions of children read comics, ergo, criminal children are likely to have read comics. When listing brutalities, Wertham throws in such asides as, “Incidentally, I have seen children vomit over comic books.” Such anecdotes illuminate a pattern of observation without sourcing that becomes increasingly irritating. “There are quite a number of obscure stores where children congregate, often in back rooms, to read and buy secondhand comic books … in some parts of cities, men hang around these stores which sometimes are foci of childhood prostitution. Evidently comic books prepare the little girls well.” Are these stores located in New York? Chicago? Sheboygan? Wertham leaves us in the dark. He also claimed that powerful forces were arrayed against him because the sheer number of comic books was essential to the health of the pulp-paper manufacturers, forcing him on a “Don Quixotic enterprise … fighting not windmills, but paper mills.”

When Pac-Man Started a National “Media Panic”
by Michael Z. Newman

This moment in the history of pop culture and technology might have seemed unprecedented, as computerized gadgets were just becoming part of the fabric of everyday life in the early ‘80s. But we can recognize it as one in a predictable series of overheated reactions to new media that go back all the way to the invention of writing (which ancients thought would spell the end of memory). There is a particularly American tradition of becoming enthralled with new technologies of communication, identifying their promise of future prosperity and renewed community. It is matched by a related American tradition of freaking out about the same objects, which are also figured as threats to life as we know it.

The emergence of the railroad and the telegraph in the 19th century and of novel 20th century technologies like the telephone, radio, cinema, television, and the internet were all similarly greeted by a familiar mix of high hopes and dark fears. In Walden, published in 1854, Henry David Thoreau warned that, “we do not ride on the railroad; it rides upon us.” Technologies of both centuries were imagined to unite a vast and dispersed nation and edify citizens, but they also were suspected of trivializing daily affairs, weakening local bonds, and worse yet, exposing vulnerable children to threats and hindering their development into responsible adults.

These expressions are often a species of moral outrage known as media panic, a reaction of adults to the perceived dangers of an emerging culture popular with children, which the parental generation finds unfamiliar and threatening. Media panics recur in a dubious cycle of lathering outrage, with grownups seeming not to realize that the same excessive alarmism has arisen in every generation. Eighteenth and 19th century novels might have caused confusion to young women about the difference between fantasy and reality, and excited their passions too much. In the 1950s, rock and roll was “the devil’s music,” feared for inspiring lust and youthful rebellion, and encouraging racial mixing. Dime novels, comic books, and camera phones have all been objects of frenzied worry about “the kids these days.”

The popularity of video games in the ‘80s prompted educators, psychotherapists, local government officeholders, and media commentators to warn that young players were likely to suffer serious negative effects. The games would influence their aficionados in all the wrong ways. They would harm children’s eyes and might cause “Space Invaders Wrist” and other physical ailments. Like television, they would be addictive, like a drug. Games would inculcate violence and aggression in impressionable youngsters. Their players would do badly in school and become isolated and desensitized. A reader wrote to The New York Times to complain that video games were “cultivating a generation of mindless, ill-tempered adolescents.”

The arcades where many teenagers preferred to play video games were imagined as dens of vice, of illicit trade in drugs and sex. Kids who went to play Tempest or Donkey Kong might end up seduced by the lowlifes assumed to hang out in arcades, spiraling into lives of substance abuse, sexual depravity, and crime. Children hooked on video games might steal to feed their habit. Reports at the time claimed that video kids had vandalized cigarette machines, pocketing the quarters and leaving behind the nickels and dimes. […]

Somehow, a generation of teenagers from the 1980s managed to grow up despite the dangers, real or imagined, from video games. The new technology could not have been as powerful as its detractors or its champions imagined. It’s easy to be captivated by novelty, but it can force us to miss the cyclical nature of youth media obsessions. Every generation fastens onto something that its parents find strange, whether Elvis or Atari. In every moment in media history, intergenerational tension accompanies the emergence of new forms of culture and communication. Now we have sexting, cyberbullying, and smartphone addiction to panic about.

But while the gadgets keep changing, our ideas about youth and technology, and our concerns about young people’s development in an uncertain and ever-changing modern world, endure.

Why calling screen time ‘digital heroin’ is digital garbage
by Rachel Becker

The supposed danger of digital media made headlines over the weekend when psychotherapist Nicholas Kardaras published a story in the New York Post called “It’s ‘digital heroin’: How screens turn kids into psychotic junkies.” In the op-ed, Kardaras claims that “iPads, smartphones and XBoxes are a form of digital drug.” He stokes fears about the potential for addiction and the ubiquity of technology by referencing “hundreds of clinical studies” that show “screens increase depression, anxiety and aggression.”

We’ve seen this form of scaremongering before. People are frequently uneasy with new technology, after all. The problem is, screens and computers aren’t actually all that new. There’s already a whole generation — millennials — who grew up with computers. They appear, mostly, to be fine, selfies aside. If computers were “digital drugs,” wouldn’t we have already seen warning signs?

No matter. Kardaras opens with a little boy who was so hooked on Minecraft that his mom found him in his room in the middle of the night, in a “catatonic stupor” — his iPad lying next to him. This is an astonishing use of “catatonic,” and is almost certainly not medically correct. It’s meant to scare parents.

by Alison Gopnik

My own childhood was dominated by a powerful device that used an optical interface to transport the user to an alternate reality. I spent most of my waking hours in its grip, oblivious of the world around me. The device was, of course, the book. Over time, reading hijacked my brain, as large areas once dedicated to processing the “real” world adapted to processing the printed word. As far as I can tell, this early immersion didn’t hamper my development, but it did leave me with some illusions—my idea of romantic love surely came from novels.

English children’s books, in particular, are full of tantalizing food descriptions. At some point in my childhood, I must have read about a honeycomb tea. Augie, enchanted, agreed to accompany me to the grocery store. We returned with a jar of honeycomb, only to find that it was an inedible, waxy mess.

Many parents worry that “screen time” will impair children’s development, but recent research suggests that most of the common fears about children and screens are unfounded. (There is one exception: looking at screens that emit blue light before bed really does disrupt sleep, in people of all ages.) The American Academy of Pediatrics used to recommend strict restrictions on screen exposure. Last year, the organization examined the relevant science more thoroughly, and, as a result, changed its recommendations. The new guidelines emphasize that what matters is content and context, what children watch and with whom. Each child, after all, will have some hundred thousand hours of conscious experience before turning sixteen. Those hours can be like the marvellous ones that Augie and I spent together bee-watching, or they can be violent or mindless—and that’s true whether those hours are occupied by apps or TV or books or just by talk.

New tools have always led to panicky speculation. Socrates thought that reading and writing would have disastrous effects on memory; the novel, the telegraph, the telephone, and the television were all declared to be the End of Civilization as We Know It, particularly in the hands of the young. Part of the reason may be that adult brains require a lot of focus and effort to learn something new, while children’s brains are designed to master new environments spontaneously. Innovative technologies always seem distracting and disturbing to the adults attempting to master them, and transparent and obvious—not really technology at all—to those, like Augie, who encounter them as children.

The misguided moral panic over Slender Man
by Adam Possamai

Sociologists argue that rather than simply being created stories, urban legends represent the fear and anxieties of current time, and in this instance, the internet culture is offering a global and a more participatory platform in the story creation process.

New technology is also allowing urban legends to be transmitted at a faster pace than before the invention of the printing press, and giving more people the opportunity to shape folk stories that blur the line between fiction and reality. Commonly, these stories take a life of their own and become completely independent from what the original creator wanted to achieve.

Yet if we were to listen to social commentary this change in the story creation process is opening the door to deviant acts.

Last century, people were already anxious about children accessing VHS and Betamax tapes and being exposed to violence and immorality. We are now likely to face a similar moral panic with regards to the internet.

Sleepwalking Through Our Dreams

In The Secret Life of Puppets, Victoria Nelson makes some useful observations of reading addiction, specifically in terms of formulaic genres. She discusses Sigmund Freud’s repetition compulsion and Lenore Terr’s post-traumatic games. She sees genre reading as a ritual-like enactment that can’t lead to resolution, and so the addictive behavior becomes entrenched. This would apply to many other forms of entertainment and consumption. And it fits into Derrick Jensen’s discussion of abuse, trauma, and the victimization cycle.

I would broaden her argument in another way. People have feared the written text ever since it was invented. In the 18th century, there took hold a moral panic about reading addiction in general and that was before any fiction genres had developed (Frank Furedi, The Media’s First Moral Panic; full text available at Wayback Machine). The written word is unchanging and so creates the conditions for repetition compulsion. Every time a text is read, it is the exact same text.

That is far different from oral societies. And it is quite telling that oral societies have a much more fluid sense of self. The Piraha, for example, don’t cling to their sense of self nor to that of others. When a Piraha individual is possessed by a spirit or meets a spirit who gives them a new name, the self that was there is no longer there. When asked where that person is, the Piraha will say that he or she isn’t there, even if that individual’s body is standing right there in front of them. They also don’t have a storytelling tradition or concern for the past.

Another thing that the Piraha apparently lack is mental illness, specifically depression along with suicidal tendencies. According to Barbara Ehrenreich from Dancing in the Streets, there wasn’t much written about depression even in the Western world until the suppression of religious and public festivities, such as Carnival. One of the most important aspects of Carnival and similar festivities was the masking, shifting, and reversal of social identities. Along with this, there was the losing of individuality within the group. And during the Middle Ages, an amazing number of days in the year were dedicated to communal celebrations. The ending of this era coincided with numerous societal changes, including the increase of literacy with the spread of the movable type printing press.

The Media’s First Moral Panic
by Frank Furedi

When cultural commentators lament the decline of the habit of reading books, it is difficult to imagine that back in the 18th century many prominent voices were concerned about the threat posed by people reading too much. A dangerous disease appeared to afflict the young, which some diagnosed as reading addiction and others as reading rage, reading fever, reading mania or reading lust. Throughout Europe reports circulated about the outbreak of what was described as an epidemic of reading. The behaviours associated with this supposedly insidious contagion were sensation-seeking and morally dissolute and promiscuous behaviour. Even acts of self-destruction were associated with this new craze for the reading of novels.

What some described as a craze was actually a rise in the 18th century of an ideal: the ‘love of reading’. The emergence of this new phenomenon was largely due to the growing popularity of a new literary genre: the novel. The emergence of commercial publishing in the 18th century and the growth of an ever-widening constituency of readers was not welcomed by everyone. Many cultural commentators were apprehensive about the impact of this new medium on individual behaviour and on society’s moral order.

With the growing popularity of novel reading, the age of the mass media had arrived. Novels such as Samuel Richardson’s Pamela, or Virtue Rewarded (1740) and Rousseau’s Julie, or the New Heloise (1761) became literary sensations that gripped the imagination of their European readers. What was described as ‘Pamela-fever’ indicated the powerful influence novels could exercise on the imagination of the reading public. Public deliberation on these ‘fevers’ focused on what was a potentially dangerous development, which was the forging of an intense and intimate interaction between the reader and literary characters. The consensus that emerged was that unrestrained exposure to fiction led readers to lose touch with reality and identify with the novel’s romantic characters to the point of adopting their behaviour. The passionate enthusiasm with which European youth responded to the publication of Johann Wolfgang von Goethe’s novel The Sorrows of Young Werther (1774) appeared to confirm this consensus. […]

What our exploration of the narrative of Werther fever suggests is that it acquired a life of its own to the point that it mutated into a taken-for-granted rhetorical idiom, which accounted for the moral problems facing society. Warnings about an epidemic of suicide said more about the anxieties of their authors than the behaviour of the readers of the novels. An inspection of the literature circulating these warnings indicates a striking absence of empirical evidence. The constant allusion to Miss. G., to nameless victims and to similarly framed death scenes suggests that these reports had little factual content to draw on. Stories about an epidemic of suicide were as fictional as the demise of Werther in Goethe’s novel.

It is, however, likely that readers of Werther were influenced by the controversy surrounding the novel. Goethe himself was affected by it and in his autobiography lamented that so many of his readers felt called upon to ‘re-enact the novel, and possibly shoot themselves’. Yet, despite the sanctimonious scaremongering, it continued to attract a large readership. While there is no evidence that Werther was responsible for the promotion of a wave of copycat suicides, it evidently succeeded in inspiring a generation of young readers. The emergence of what today would be described as a cult of fans with some of the trappings of a youth subculture is testimony to the novel’s powerful appeal.

The association of the novel with the disorganisation of the moral order represented an early example of a media panic. The formidable, sensational and often improbable effects attributed to the consequences of reading in the 18th century provided the cultural resources on which subsequent reactions to the cinema, television or the Internet would draw on. In that sense Werther fever anticipated the media panics of the future.

Curiously, the passage of time has not entirely undermined the association of Werther fever with an epidemic of suicide. In 1974 the American sociologist Dave Phillips coined the term the ‘Werther Effect’ to describe media-stimulated imitation of suicidal behaviour. But the durability of the Werther myth notwithstanding, contemporary media panics are rarely focused on novels. In the 21st century the simplistic cause and effect model of the ‘Werther Effect’ is more likely to be expressed through moral anxieties about the danger of cybersuicide, copycat online suicide.

The Better Angels of Our Nature
by Steven Pinker
Kindle Locations 13125-13143
(see To Imagine and Understand)

It would be surprising if fictional experiences didn’t have similar effects to real ones, because people often blur the two in their memories. 65 And a few experiments do suggest that fiction can expand sympathy. One of Batson’s radio-show experiments included an interview with a heroin addict who the students had been told was either a real person or an actor. 66 The listeners who were asked to take his point of view became more sympathetic to heroin addicts in general, even when the speaker was fictitious (though the increase was greater when they thought he was real). And in the hands of a skilled narrator, a fictitious victim can elicit even more sympathy than a real one. In his book The Moral Laboratory, the literary scholar Jèmeljan Hakemulder reports experiments in which participants read similar facts about the plight of Algerian women through the eyes of the protagonist in Malika Mokeddem’s novel The Displaced or from Jan Goodwin’s nonfiction exposé Price of Honor. 67 The participants who read the novel became more sympathetic to Algerian women than those who read the true-life account; they were less likely, for example, to blow off the women’s predicament as a part of their cultural and religious heritage. These experiments give us some reason to believe that the chronology of the Humanitarian Revolution, in which popular novels preceded historical reform, may not have been entirely coincidental: exercises in perspective-taking do help to expand people’s circle of sympathy.

The science of empathy has shown that sympathy can promote genuine altruism, and that it can be extended to new classes of people when a beholder takes the perspective of a member of that class, even a fictitious one. The research gives teeth to the speculation that humanitarian reforms are driven in part by an enhanced sensitivity to the experiences of living things and a genuine desire to relieve their suffering. And as such, the cognitive process of perspective-taking and the emotion of sympathy must figure in the explanation for many historical reductions in violence. They include institutionalized violence such as cruel punishments, slavery, and frivolous executions; the everyday abuse of vulnerable populations such as women, children, homosexuals, racial minorities, and animals; and the waging of wars, conquests, and ethnic cleansings with a callousness to their human costs.

Innocent Weapons:
The Soviet and American Politics of Childhood in the Cold War

by Margaret E. Peacock
pp. 88-89

As a part of their concern over American materialism, politicians and members of the American public turned their attention to the rising influence of media and popular culture upon the next generation.69 Concerns over uncontrolled media were not new in the United States in the 1950s. They had a way of erupting whenever popular culture underwent changes that seemed to differentiate the generations. This was the case during the silent film craze of the 1920s and when the popularity of dime novels took off in the 1930s.70 Yet, for many in the postwar era, the press, the radio, and the television presented threats to children that the country had never seen before. As members of Congress from across the political spectrum would argue throughout the 1950s, the media had the potential to present a negative image of the United States abroad, and it ran the risk of corrupting the minds of the young at a time when shoring up national patriotism and maintaining domestic order were more important than ever. The impact of media on children was the subject of Fredric Wertham’s 1953 best-selling book Seduction of the Innocent, in which he chronicled his efforts over the course of three years to “trace some of the roots of the modern mass delinquency.”71 Wertham’s sensationalist book documented case after case of child delinquents who seemed to be mimicking actions that they had seen on the television or, in particular, in comic strips. Horror comics, which were popular from 1948 until 1954, showed images of children killing their parents and peers, sometimes in gruesome ways—framing them for murder—being cunning and devious, even cannibalistic. A commonly cited story was that of “Bloody Mary,” published by Farrell Comics, which told the story of a seven-year-old girl who strangles her mother, sends her father to the electric chair for the murder, and then kills a psychiatrist who has learned that the girl committed these murders and that she is actually a dwarf in disguise.72 Wertham’s crusade against horror comics was quickly joined by two Senate subcommittees in 1954, at the heads of which sat Estes Kefauver and Robert Hendrickson. They argued to their colleagues that the violence and destruction of the family in these comic books symbolized “a terrible twilight zone between sanity and madness.”73 They contended that children found in these comic books violent models of behavior and that they would otherwise be law abiding. J. Edgar Hoover chimed in to comment that “a comic which makes lawlessness attractive . . . may influence the susceptible boy or girl.”74

Such depictions carried two layers of threat. First, as Wertham, Hoover, and Kefauver argued, they reflected the seeming potential of modern media to transform “average” children into delinquents.75 Alex Drier, popular NBC newscaster, argued in May 1954 that “this continuous flow of filth [is] so corruptive in its effects that it has actually obliterated decent instincts in many of our children.”76 Yet perhaps more telling, the comics, as well as the heated response that they elicited, also reflected larger anxieties about what identities children should assume in contemporary America. As in the case of Bloody Mary, these comics presented an image of apparently sweet youths who were in fact driven by violent impulses and were not children at all. “How can we expose our children to this and then expect them to run the country when we are gone?” an agitated Hendrickson asked his colleagues in 1954.77 Bloody Mary, like the uneducated dolts of the Litchfield report and the spoiled boys of Wylie’s conjuring, presented an alternative identity for American youth that seemed to embody a new and dangerous future.

In the early months of 1954, Robert Hendrickson argued to his colleagues that “the strained international and domestic situation makes it impossible for young people of today to look forward with certainty to higher education, to entering a trade or business, to plans for marriage, a home, and family. . . . Neither the media, nor modern consumerism, nor the threat from outside our borders creates a problem child. But they do add to insecurity, to loneliness, to fear.”78 For Hendrickson these domestic trends, along with what he called “deficient adults,” seemed to have created a new population of troubled and victimized children who were “beyond the pale of our society.”79

The End of Victory Culture:
Cold War America and the Disillusioning of a Generation

by Tom Engelhardt
Kindle Locations 2872-2910

WORRY, BORDERING ON HYSTERIA, about the endangering behaviors of “youth” has had a long history in America, as has the desire of reformers and censors to save “innocent” children from the polluting effects of commercial culture. At the turn of the century, when middle-class white adolescents first began to take their place as leisure-time trendsetters, fears arose that the syncopated beat of popular “coon songs” and ragtime music would demonically possess young listeners, who might succumb to the “evils of the Negro soul.” Similarly, on-screen images of crime, sensuality, and violence in the earliest movies, showing in “nickel houses” run by a “horde of foreigners,” were decried by reformers. They were not just “unfit for children’s eyes,” but a “disease” especially virulent to young (and poor) Americans, who were assumed to lack all immunity to such spectacles. 1 […]

To many adults, a teen culture beyond parental oversight had a remarkably alien look to it. In venues ranging from the press to Senate committees, from the American Psychiatric Association to American Legion meetings, sensational and cartoonlike horror stories about the young or the cultural products they were absorbing were told. Tabloid newspaper headlines reflected this: “Two Teen Thrill Killings Climax City Park Orgies. Teen Age Killers Pose a Mystery— Why Did They Do It?… 22 Juveniles Held in Gang War. Teen Age Mob Rips up BMT Train. Congressmen Stoned, Cops Hunt Teen Gang.” After a visit to the movies in 1957 to watch two “teenpics,” Rock All Night and Dragstrip Girl, Ruth Thomas of Newport, Rhode Island’s Citizen’s Committee on Literature expressed her shock in words at least as lurid as those of any tabloid: “Isn’t it a form of brain-washing? Brain-washing the minds of the people and especially the youth of our nation in filth and sadistic violence. What enemy technique could better lower patriotism and national morale than the constant presentation of crime and horror both as news and recreation.” 3

You did not have to be a censor, a right-wing anti-Communist, or a member of the Catholic Church’s Legion of Decency, however, to hold such views. Dr. Frederick Wertham, a liberal psychiatrist, who testified in the landmark Brown v. Board of Education desegregation case and set up one of the first psychiatric clinics in Harlem, publicized the idea that children viewing commercially produced acts of violence and depravity, particularly in comic books, could be transformed into little monsters. The lurid title of his best-selling book, Seduction of the Innocent, an assault on comic books as “primers for crime,” told it all. In it, Dr. Wertham offered copious “horror stories” that read like material from Tales from the Crypt: “Three boys, six to eight years old, took a boy of seven, hanged him nude from a tree, his hands tied behind him, then burned him with matches. Probation officers investigating found that they were re-enacting a comic-book plot.… A boy of thirteen committed a lust murder of a girl of six. After his arrest, in jail, he asked for comic books.” 4

Kindle Locations 2927-2937

The two— hood and performer, lower-class white and taboo black— merged in the “pelvis” of a Southern “greaser” who dressed like a delinquent, used “one of black America’s favorite products, Royal Crown Pomade hair grease” (meant to give hair a “whiter” look), and proceeded to move and sing “like a negro.” Whether it was because they saw a white youth in blackface or a black youth in whiteface, much of the media grew apoplectic and many white parents alarmed. In the meantime, swiveling his hips and playing suggestively with the microphone, Elvis Presley broke into the lives of millions of teens in 1956, bringing with him an element of disorder and sexuality associated with darkness. 6†

The second set of postwar fears involved the “freedom” of the commercial media— record and comic book companies, radio stations, the movies, and television— to concretize both the fantasies of the young and the nightmarish fears of grown-ups into potent products. For many adults, this was abundance as betrayal, the good life not as a vision of Eden but as an unexpected horror story.

Kindle Locations 2952-2979

Take comic books. Even before the end of World War II, a new kind of content was creeping into them as they became the reading matter of choice for the soldier-adolescent. […] Within a few years, “crime” comics like Crime Does Not Pay emerged from the shadows, displaying a wide variety of criminal acts for the delectation of young readers. These were followed by horror and science fiction comics, purchased in enormous numbers. By 1953, more than 150 horror comics were being produced monthly, featuring acts of torture often of an implicitly sexual nature, murders and decapitations of various bloody sorts, visions of rotting flesh, and so on. 9

Miniature catalogs of atrocities, their feel was distinctly assaultive. In their particular version of the spectacle of slaughter, they targeted the American family, the good life, and revered institutions. Framed by sardonic detective narrators or mocking Grand Guignol gatekeepers, their impact was deconstructive. Driven by a commercial “hysteria” as they competed to attract buyers with increasingly atrocity-ridden covers and stories, they both partook of and mocked the hysteria about them. Unlike radio or television producers, the small publishers of the comic book business were neither advertiser driven nor corporately controlled.

Unlike the movies, comics were subject to no code. Unlike the television networks, comics companies had no Standards and Practices departments. No censoring presence stood between them and whoever would hand over a dime at a local newsstand. Their penny-ante ads and pathetic pay scale ensured that writing and illustrating them would be a job for young men in their twenties (or even teens). Other than early rock and roll, comics were the only cultural form of the period largely created by the young for those only slightly younger. In them, uncensored, can be detected the dismantling voice of a generation that had seen in the world war horrors beyond measure.

The hysterical tone of the response to these comics was remarkable. Comics publishers were denounced for conspiring to create a delinquent nation. Across the country, there were publicized comic book burnings like one in Binghamton, New York, where 500 students were dismissed from school early in order to torch 2,000 comics and magazines. Municipalities passed ordinances prohibiting the sale of comics, and thirteen states passed legislation to control their publication, distribution, or sale. Newspapers and magazines attacked the comics industry. The Hartford Courant decried “the filthy stream that flows from the gold-plated sewers of New York.” In April 1954, the Senate Subcommittee to Investigate Juvenile Delinquency convened in New York to look into links between comics and teen crime. 10

Kindle Locations 3209-3238

If sponsors and programmers recognized the child as an independent taste center, the sight of children glued to the TV, reveling in their own private communion with the promise of America, proved unsettling to some adults. The struggle to control the set, the seemingly trancelike quality of TV time, the soaring number of hours spent watching, could leave a parent feeling challenged by some hard-to-define force released into the home under the aegis of abundance, and the watching child could gain the look of possession, emptiness, or zombification.

Fears of TV’s deleterious effects on the child were soon widespread. The medical community even discovered appropriate new childhood illnesses. There was “TV squint” or eyestrain, “TV bottom,” “bad feet” (from TV-induced inactivity), “frogitis” (from a viewing position that put too much strain on inner-leg ligaments), “TV tummy” (from TV-induced overexcitement), “TV jaw” or “television malocclusion” (from watching while resting on one’s knuckles, said to force the eyeteeth inward), and “tired child syndrome” (chronic fatigue, loss of appetite, headaches, and vomiting induced by excessive viewing).

However, television’s threat to the child was more commonly imagined to lie in the “violence” of its programming. Access to this “violence” and the sheer number of hours spent in front of the set made the idea that this new invention was acting in loco parentis seem chilling to some; and it was true that via westerns, crime shows, war and spy dramas, and Cold War-inspired cartoons TV was indiscriminately mixing a tamed version of the war story with invasive Cold War fears. Now, children could endlessly experience the thrill of being behind the barrel of a gun. Whether through the Atom Squad’s three government agents, Captain Midnight and his Secret Squadron, various FBI men, cowboys, or detectives, they could also encounter “an array of H-bomb scares, mad Red scientists, [and] plots to rule the world,” as well as an increasing level of murder and mayhem that extended from the six-gun frontier of the “adult” western to the blazing machine guns of the crime show. 30

Critics, educators, and worried parents soon began compiling TV body counts as if the statistics of victory were being turned on young Americans. “Frank Orme, an independent TV watchdog, made a study of Los Angeles television in 1952 and noted, in one week, 167 murders, 112 justifiable homicides, and 356 attempted murders. Two-thirds of all the violence he found occurred in children’s shows. In 1954, Orme said violence on kids’ shows had increased 400 percent since he made his first report.” PTAs organized against TV violence, and Senate hearings searched for links between TV programming and juvenile delinquency.

Such “violence,” though, was popular. In addition, competition for audiences among the three networks had the effect of ratcheting up the pressures for violence, just as it had among the producers of horror comics. At The Untouchables, a 1960 hit series in which Treasury agent Eliot Ness took on Chicago’s gangland (and weekly reached 5-8 million young viewers), ABC executives would push hard for more “action.” Producer Quinn Martin would then demand the same of his subordinates, “or we are all going to get clobbered.” In a memo to one of the show’s writers, he asked: “I wish you would come up with a different device than running the man down with a car, as we have done this now in three different shows. I like the idea of sadism, but I hope we can come up with another approach to it.” 31

Paradoxes of State and Civilization Narratives

Below is a passage from a recent book by James C. Scott, Against the Grain.

The book is about agriculture, sedentism, and early statism. The author questions the standard narrative. In doing so, he looks more closely at what the evidence actually shows us about civilization, specifically in terms of supposed collapses and dark ages (elsewhere in the book, he also discusses how non-state ‘barbarians’ are connected to, influenced by, and defined according to states).

Oddly, Scott never mentions Göbekli Tepe. It is an ancient archaeological site that offers intriguing evidence of civilization preceding and hence not requiring agriculture, sedentism, or statism. As has been said of it, “First came the temple, then the city.” That would seem to fit into the book’s framework.

The other topic not mentioned, less surprisingly, is Julian Jaynes’ theory of bicameralism. Jaynes’ view might complicate Scott’s interpretations. Scott goes into great detail about domestication and slavery, specifically in the archaic civilizations such as first seen with the walled city-states. But Jaynes pointed out that authoritarianism as we know it didn’t seem to exist early on, as the bicameral mind made social conformity possible through non-individualistic identity and collective experience (explained in terms of the hypothesis of archaic authorization).

Scott’s focus is more on external factors. From my perusal of the book, he doesn’t seem to fully take into account social science research, cultural studies, anthropology, philology, etc. The thesis of the book could have been further developed by exploring other areas, although maybe the narrow focus is useful for emphasizing the central point about agriculture. There is a deeper issue, though, that the author does touch upon. What does it mean to be a domesticated human? After all, that is what civilization is about.

He does offer an interesting take on human domestication. Basically, he doesn’t think most humans ever took up the yoke of civilization willingly. There must be systems of force and control in place to make people submit. I might agree, even as I’m not sure that this is the central issue. It’s less about how people submit in body than how they submit in mind. Whether or not we are sheep, there is no shepherd. Even the rulers of the state are sheep.

The temple comes first. Before civilization proper, before walled city-states, before large-scale settlement, before agriculture, before even pottery, there was a temple. What does the temple represent?

* * *

Against the Grain
by James C. Scott
pp. 22-27

PARADOXES OF STATE AND CIVILIZATION NARRATIVES

A foundational question underlying state formation is how we (Homo sapiens sapiens) came to live amid the unprecedented concentrations of domesticated plants, animals, and people that characterize states. From this wide-angle view, the state form is anything but natural or given. Homo sapiens appeared as a subspecies about 200,000 years ago and is found outside of Africa and the Levant no more than 60,000 years ago. The first evidence of cultivated plants and of sedentary communities appears roughly 12,000 years ago. Until then—that is to say for ninety-five percent of the human experience on earth—we lived in small, mobile, dispersed, relatively egalitarian, hunting-and-gathering bands. Still more remarkable, for those interested in the state form, is the fact that the very first small, stratified, tax-collecting, walled states pop up in the Tigris and Euphrates Valley only around 3,100 BCE, more than four millennia after the first crop domestications and sedentism. This massive lag is a problem for those theorists who would naturalize the state form and assume that once crops and sedentism, the technological and demographic requirements, respectively, for state formation were established, states/empires would immediately arise as the logical and most efficient units of political order. 4

These raw facts trouble the version of human prehistory that most of us (I include myself here) have unreflectively inherited. Historical humankind has been mesmerized by the narrative of progress and civilization as codified by the first great agrarian kingdoms. As new and powerful societies, they were determined to distinguish themselves as sharply as possible from the populations from which they sprang and that still beckoned and threatened at their fringes. In its essentials, it was an “ascent of man” story. Agriculture, it held, replaced the savage, wild, primitive, lawless, and violent world of hunter-gatherers and nomads. Fixed-field crops, on the other hand, were the origin and guarantor of the settled life, of formal religion, of society, and of government by laws. Those who refused to take up agriculture did so out of ignorance or a refusal to adapt. In virtually all early agricultural settings the superiority of farming was underwritten by an elaborate mythology recounting how a powerful god or goddess entrusted the sacred grain to a chosen people.

Once the basic assumption of the superiority and attraction of fixed-field farming over all previous forms of subsistence is questioned, it becomes clear that this assumption itself rests on a deeper and more embedded assumption that is virtually never questioned. And that assumption is that sedentary life itself is superior to and more attractive than mobile forms of subsistence. The place of the domus and of fixed residence in the civilizational narrative is so deep as to be invisible; fish don’t talk about water! It is simply assumed that weary Homo sapiens couldn’t wait to finally settle down permanently, could not wait to end hundreds of millennia of mobility and seasonal movement. Yet there is massive evidence of determined resistance by mobile peoples everywhere to permanent settlement, even under relatively favorable circumstances. Pastoralists and hunting-and-gathering populations have fought against permanent settlement, associating it, often correctly, with disease and state control. Many Native American peoples were confined to reservations only on the heels of military defeat. Others seized historic opportunities presented by European contact to increase their mobility, the Sioux and Comanche becoming horseback hunters, traders, and raiders, and the Navajo becoming sheep-based pastoralists. Most peoples practicing mobile forms of subsistence—herding, foraging, hunting, marine collecting, and even shifting cultivation—while adapting to modern trade with alacrity, have bitterly fought permanent settlement. At the very least, we have no warrant at all for supposing that the sedentary “givens” of modern life can be read back into human history as a universal aspiration. 5

The basic narrative of sedentism and agriculture has long survived the mythology that originally supplied its charter. From Thomas Hobbes to John Locke to Giambattista Vico to Lewis Henry Morgan to Friedrich Engels to Herbert Spencer to Oswald Spengler to social Darwinist accounts of social evolution in general, the sequence of progress from hunting and gathering to nomadism to agriculture (and from band to village to town to city) was settled doctrine. Such views nearly mimicked Julius Caesar’s evolutionary scheme from households to kindreds to tribes to peoples to the state (a people living under laws), wherein Rome was the apex, with the Celts and then the Germans ranged behind. Though they vary in details, such accounts record the march of civilization conveyed by most pedagogical routines and imprinted on the brains of schoolgirls and schoolboys throughout the world. The move from one mode of subsistence to the next is seen as sharp and definitive. No one, once shown the techniques of agriculture, would dream of remaining a nomad or forager. Each step is presumed to represent an epoch-making leap in mankind’s well-being: more leisure, better nutrition, longer life expectancy, and, at long last, a settled life that promoted the household arts and the development of civilization. Dislodging this narrative from the world’s imagination is well nigh impossible; the twelve-step recovery program required to accomplish that beggars the imagination. I nevertheless make a small start here.

It turns out that the greater part of what we might call the standard narrative has had to be abandoned once confronted with accumulating archaeological evidence. Contrary to earlier assumptions, hunters and gatherers—even today in the marginal refugia they inhabit—are nothing like the famished, one-day-away-from-starvation desperados of folklore. Hunters and gatherers have, in fact, never looked so good—in terms of their diet, their health, and their leisure. Agriculturalists, on the contrary, have never looked so bad—in terms of their diet, their health, and their leisure. 6 The current fad of “Paleolithic” diets reflects the seepage of this archaeological knowledge into the popular culture. The shift from hunting and foraging to agriculture—a shift that was slow, halting, reversible, and sometimes incomplete—carried at least as many costs as benefits. Thus while the planting of crops has seemed, in the standard narrative, a crucial step toward a utopian present, it cannot have looked that way to those who first experienced it: a fact some scholars see reflected in the biblical story of Adam and Eve’s expulsion from the Garden of Eden.

The wounds the standard narrative has suffered at the hands of recent research are, I believe, life threatening. For example, it has been assumed that fixed residence—sedentism—was a consequence of crop-field agriculture. Crops allowed populations to concentrate and settle, providing a necessary condition for state formation. Inconveniently for the narrative, sedentism is actually quite common in ecologically rich and varied, preagricultural settings—especially wetlands bordering the seasonal migration routes of fish, birds, and larger game. There, in ancient southern Mesopotamia (Greek for “between the rivers”), one encounters sedentary populations, even towns, of up to five thousand inhabitants with little or no agriculture. The opposite anomaly is also encountered: crop planting associated with mobility and dispersal except for a brief harvest period. This last paradox alerts us again to the fact that the implicit assumption of the standard narrative—namely that people couldn’t wait to abandon mobility altogether and “settle down”—may also be mistaken.

Perhaps most troubling of all, the civilizational act at the center of the entire narrative: domestication turns out to be stubbornly elusive. Hominids have, after all, been shaping the plant world—largely with fire—since before Homo sapiens. What counts as the Rubicon of domestication? Is it tending wild plants, weeding them, moving them to a new spot, broadcasting a handful of seeds on rich silt, depositing a seed or two in a depression made with a dibble stick, or ploughing? There appears to be no “aha!” or “Edison light bulb” moment. There are, even today, large stands of wild wheat in Anatolia from which, as Jack Harlan famously showed, one could gather enough grain with a flint sickle in three weeks to feed a family for a year. Long before the deliberate planting of seeds in ploughed fields, foragers had developed all the harvest tools, winnowing baskets, grindstones, and mortars and pestles to process wild grains and pulses. 7 For the layman, dropping seeds in a prepared trench or hole seems decisive. Does discarding the stones of an edible fruit into a patch of waste vegetable compost near one’s camp, knowing that many will sprout and thrive, count?

For archaeo-botanists, evidence of domesticated grains depended on finding grains with nonbrittle rachis (favored intentionally and unintentionally by early planters because the seedheads did not shatter but “waited for the harvester”) and larger seeds. It now turns out that these morphological changes seem to have occurred well after grain crops had been cultivated. What had appeared previously to be unambiguous skeletal evidence of fully domesticated sheep and goats has also been called into question. The result of these ambiguities is twofold. First, it makes the identification of a single domestication event both arbitrary and pointless. Second, it reinforces the case for a very, very long period of what some have called “low-level food production” of plants not entirely wild and yet not fully domesticated either. The best analyses of plant domestication abolish the notion of a singular domestication event and instead argue, on the basis of strong genetic and archaeological evidence, for processes of cultivation lasting up to three millennia in many areas and leading to multiple, scattered domestications of most major crops (wheat, barley, rice, chick peas, lentils). 8

While these archaeological findings leave the standard civilizational narrative in shreds, one can perhaps see this early period as part of a long process, still continuing, in which we humans have intervened to gain more control over the reproductive functions of the plants and animals that interest us. We selectively breed, protect, and exploit them. One might arguably extend this argument to the early agrarian states and their patriarchal control over the reproduction of women, captives, and slaves. Guillermo Algaze puts the matter even more boldly: “Early Near Eastern villages domesticated plants and animals. Uruk urban institutions, in turn, domesticated humans.” 9

Human Condition

Human nature and the human condition
by The Philosopher’s Beard Blog

“The distinction between human nature and the human condition has implications that go beyond whether some academic sub-fields are built on fundamental error and thus a waste of time (hardly news). The foundational mistake of assuming that certain features prominent among contemporary human beings are true of H. sapiens and therefore true of all of us has implications for how we think about ourselves now. There is a lack of adequate critical reflection – of a true scientific spirit of inquiry – in much of the naturalising project. It fits all too easily with our natural desire for a convenient truth: that the way the world seems is the way it has to be.

“For example, many people believe that to be human is to be religious – or at least to have a ‘hunger for religion’ – and argue as a result that religion should be accorded special prominence and autonomy in our societies – in our education, civil, and political institutions. American ‘secularism’ for example might be said to be built on this principle: hence all religions are engaged in a similar project of searching for the divine and deserve equal respect. The pernicious implication is that the non-religious (who are not the same as atheists, by the way) are somehow lacking in an essential human capability, and should be pitied or perhaps given help to overcome the gaping hole in their lives.

“Anatomically modern humans have been around in our current form for around 200,000 years but while our physiological capacities have scarcely changed we are cognitively very different. Human beings operate in a human world of our own creation, as well as in the natural, biological world that we are given. In the human world people create new inventions – like religion or war or slavery – that do something for them. Those inventions succeed and spread in so far as they are amenable to our human nature and our other inventions, and by their success they condition us to accept the world they create until it seems like it could not have been otherwise.

“Recognising the fact that the human condition is human-made offers us the possibility to scrutinise it, to reflect, and perhaps even to adopt better inventions. Slavery was once so dominant in our human world that even Aristotle felt obliged to give an account of its naturalness (some people are just naturally slavish). But we discovered a better invention – market economies – that has made inefficient slavery obsolete and now almost extinct (which is not to say that this invention is perfect either). The human condition concerns humans as we are, but not as we have to be.”

The Final Rhapsody of Charles Bowden
A visit with the famed journalist just before his death.
by Scott Carrier, Mother Jones Magazine

“Postscript from Bowden’s Blood Orchid, 1995: Imagine the problem is not physical. Imagine the problem has never been physical, that it is not biodiversity, it is not the ozone layer, it is not the greenhouse effect, the whales, the old-growth forest, the loss of jobs, the crack in the ghetto, the abortions, the tongue in the mouth, the diseases stalking everywhere as love goes on unconcerned. Imagine the problem is not some syndrome of our society that can be solved by commissions or laws or a redistribution of what we call wealth. Imagine that it goes deeper, right to the core of what we call our civilization and that no one outside of ourselves can affect real change, that our civilization, our governments are sick and that we are mentally ill and spiritually dead and that all our issues and crises are symptoms of this deeper sickness … then what are we to do?”

Making Gods, Making Individuals

I’ve been reading about bicameralism and the Axial Age. It is all very fascinating.

It’s strange to look back at that era of transformation. The modern sense of self-conscious, introspective, autonomous individuality (as moral agent and rational actor) was just emerging after the breakdown of the bicameral mind. What came before that is almost incomprehensible to us.

One interesting factor is that civilization didn’t create organized religion, but the other way around. Or so it seems, according to the archaeological evidence. When humans were still wandering hunter-gatherers, they began building structures for worship. It was only later that people started settling down around these worship centers. So, humans built permanent houses for the gods before they built permanent houses for themselves.

These God Houses often originated as tombs and burial mounds of revered leaders. The first deities seem to have been god-kings. The leader was considered a god while alive, or spoke for a god. In either case, death made concrete the deification of the former leader. In doing so, the corpse or some part of it, such as the skull, would become the worshipped idol. Later on, it became more common to carve a statue, which allowed for a longer-lasting god less prone to decay.

God(s) didn’t make humans. Rather, humans in a very literal sense made god(s). They made the form of the god or used the already available form of a corpse or skull. It was sort of like trapping the dead king’s soul and forcing it to play the role of god.

These bicameral people didn’t make the distinctions we make. There was no clear separation between the divine and the human, between the individual and the group. It was all a singular pre-individuated experience. These ancient humans heard voices, but they had no internal space for their own voice. The voices were heard in the world all around them. The king was or spoke for the high god, and that voice continued speaking even after the king died. We moderns would call that a hallucination, but to them it was just their daily reality.

With the breakdown of the bicameral mind, there was a crisis of community and identity. The entire social order broke down because of large-scale environmental catastrophes that killed much of the human population of the time or turned them into refugees. In a short period, nearly all the great civilizations collapsed in close succession, each collapse sending refugees outward in waves of chaos and destruction. Nothing like it has been seen before or since in recorded history.

People were desperate to make sense of what happened. But the voices of the gods had grown distant or were silenced. The temples were destroyed, the idols gone, traditions lost, and communities splintered. The bicameral societies had been extremely stable and were utterly dependent on that stability. They couldn’t deal with change at that level. The bicameral mind itself could no longer function. These societies never recovered from this mass tragedy.

An innovation that became useful in this era was improved forms of writing. With alphabets and scrolls, the ancient oral traditions were written down and altered in the process. Also, new literary traditions increasingly took hold. Epics and canons were formed to bring new order. What formed from this was a sense of the past as different from the present. There was some basic understanding that humanity had changed and that the world used to be different.

A corollary innovation was that, instead of idol worship, people began to worship these new texts, first as scrolls and later as books. They found a more portable way of trapping a god. But the loss of the more concrete forms of worship led to the gods becoming more distant. People less often heard the voices of the gods for themselves and instead turned to texts that recorded the cultural memory of the last people who heard the divine speaking (e.g., Moses) or even the last person who spoke as the divine (e.g., Jesus Christ).

The divine was increasingly brought down to the human level and yet at the same time increasingly made more separate from daily experience. It wasn’t just that the voices of the gods went silent. Rather, the voices that used to be heard externally were being internalized. What once was recognized as divine and as other became the groundwork upon which the individuated self was built. God became a still, small voice and slowly lost its divine quality altogether. People stopped hearing voices of non-human entities. Instead, they developed a thinking mind. The gods became trapped in the human skull, and you could say that they forgot they were gods.

The process of making gods eventually transitioned into the process of making individuals. We revere individuality as strongly as people once revered the divine. That is an odd thing.

Society: Precarious or Persistent?

I sometimes think of society as precarious. It can seem easier to destroy something than to create a new thing or to re-create what was lost. It’s natural to take things for granted, until they are gone. Wisdom is learning to appreciate what you have while you have it.

There is value to this perspective, as it expresses the precautionary principle. This includes a wariness about messing with that which we don’t understand… and there is very little in this world we understand as well as maybe we should. We ought to appreciate what we inherit from the generations before us. We don’t know what went into making what we have possible.

Still, I’m not sure this is always the best way to think about it.

Many aspects of society can be as tough to kill as weeds. Use enough harsh chemicals and weeds can be killed, but even then they have a way of popping back up. Cultures are like weeds. They persist against amazing odds. We are all living evidence of this, descendants of survivors upon survivors, the products of many millennia of social advance.

In nature, a bare patch of earth rarely remains bare for long, even if doused with weed-killer. You can kill one thing and then something else will take its place. The best way to keep a weed from growing there is to plant other things that make it less hospitable. It’s as much about what a person wants to grow as about what a person doesn’t want to grow.

This is an apt metaphor for the project of imperialism and colonialism. Westerners perceived Africa and the Americas as places of wilderness. They needed to be tamed, and that involved farming. The native plants typically were seen as weeds. Europeans couldn’t even recognize some of the agrarian practices of indigenous peoples, for they didn’t fit the European idea of a farm. They just saw weeds. So, they destroyed what they couldn’t appreciate. As far as they were concerned, it was unused land to be taken and cultivated, which is to say made civilized.

Most of them weren’t going around wantonly destroying everything in sight. They were trying to create something in what to them was a new land and, in the case of the disease impact in the Americas, a seemingly uninhabited land in many cases. Much of the destruction of other societies was incidental from their perspective, although there was plenty of systematic destruction as well. However, my point is that all of this happened in the context of what was seen as “creative destruction”. It was part of a paternalistic project of ‘civilizing’ the world.

In this project, not all was destroyed. Plenty of indigenous people remain in existence and have retained, to varying degrees, their traditional cultures. Still, those who weren’t destroyed had their entire worlds turned upside down.

An example I was thinking about comes from Christine Kenneally’s recent book, The Invisible History of the Human Race.

The areas of Africa from which many slaves were taken were originally home to high-functioning societies. They had developed economies and established governments. This meant they had at least a basic culture of trust to make all of this possible. It was probably the developed social system and infrastructure that made the slave trade attractive in those places. These Africans were desirable slaves for the very reason that they came from highly developed societies. They had knowledge and skills that the European enslavers lacked.

This is where my original thought comes in. From one perspective, it was simply the destruction of a once stable society built on a culture of trust. From another perspective, a new social order was created to take the place of the old.

The slave trade obviously created an atmosphere of fear, conflict, and desperation. It eroded trust, turning village against village, neighbor against neighbor, and even families against their own kin. Yet the slave trade was also the foundation of something new: imperialism and colonialism. The agents of this new order didn’t annihilate all of African society. What they did was conquer these societies, and then the empires divvied up the spoils. In this process, new societies were built on top of the old, and so the countries we know today took form.

If these ancient African cultures were genuinely precarious societies, then we would have expected different results. It was the rock-solid substratum that made this transition to colonial rule possible. Even the development of cultures of distrust was a sign of a functioning society in defensive mode. These societies weren’t destroyed. They were defending themselves from destruction under difficult conditions. These societies persisted amidst change by adapting to change.

It is impossible to make a value judgment of this persistence. A culture of distrust may be less than optimal, but it makes perfect sense in these situations. These people have had to fight for their survival. They aren’t going to be taken for fools. Considering the world is still ruled by their former colonizers, they have every right to move forward with trepidation. They would be crazy to do otherwise.

In comparison, I was thinking of societies known for their strong cultures of trust. Those that come to mind are Scandinavia, Germany, and Japan. These societies are also known for their xenophobia. They may have strong trust for insiders, but this is paired with strong distrust of outsiders. So, there is some nuance to what we mean when we speak of cultures of trust. Anyway, it is true that cultures of trust tend to lead to high economic development and wealth. But, as with the examples of Germany and Japan, the xenophobic side of the equation can also lead to mass destruction and violent oppression that impacts people far outside of their national borders.

As for cultures of distrust, they tend primarily to keep their distrust contained within their own boundaries. Few of the former colonies have become empires colonizing other societies. The United States is one of the few exceptions, probably because the native population was so severely decimated and made a minority in their own land. It also should be noted that the U.S. measures fairly high as a culture of trust. I suspect it requires a strong culture of trust to make for an effective empire, and so it oddly may require a culture of trust among the occupiers in order to create cultures of distrust in the occupied and formerly occupied societies. That is sad to think about.

Cultures tend to persist, even when some people would rather they not. Claiming societies to be precarious, in many cases, could be considered wishful thinking. Social orders must serve one purpose before all others: self-perpetuation.

The core of my message here is that we should be as concerned about what we are creating as about what we are destroying. Africa is an example of that. A similar example is what happened to the Ottoman Empire. In both cases, they were divided up by the conquering nations, and artificial boundaries were created that inevitably led to conflict. This formed the basis for all the problems that have continued in the Middle East and the Arab world, extending into North Africa.

That world of conflict didn’t just happen. It was intentionally created. The powers that be wanted the local people to be divided against themselves. It made it easier to rule over them or to otherwise take advantage of them, such as in procuring or stealing their natural resources.

We Americans inherited that colonial mess, as we are part of it. America has never known any other world, for we were born out of the same oppression as the African and Middle Eastern countries. Now, the U.S. has taken over the role of the British Empire, the former ruler now made a partner to and subsidiary of American global power. In this role, we assassinate democratically elected leaders, foment coups d’état, arm rebel groups, invade and occupy countries, bomb entire regions into oblivion, etc.

The U.S. military can topple a leader like Saddam Hussein and destroy the social order he created, one that maintained secular stability, but the U.S. can’t rebuild what it destroyed. Anyway, that isn’t the point. The U.S. never cared about rebuilding anything. It was always about creating something entirely new. Yet the Iraqi people and their society persist, even in a state of turmoil.

The old persists as it is transformed.

What exactly persists in these times of change? Which threads can be traced into the past and which threads will continue to unwind into the future? What is being woven from these threads? What will be inherited by the following generations?

Every Freedom Comes At A Cost of Freedom

We like to think we are a free people in a free society. But our freedoms are social and political freedoms.

They are freedoms of the system we find ourselves in. We didn’t freely choose to be born into the system. We signed no social contract. There is no escape from the system itself. Our freedom is only how we play the game, not whether we play. We didn’t write the rules, and yet we must obey them.

That is what society is. We do not exist in a state of nature. Natural rights and natural liberty are what someone has who lives alone on a desert island. They are free in a way few people could ever imagine. Most people in that situation, though, would gladly give up the freedom of their isolation and accept the costs of being a member of society.

Society makes dependents of us all, for good or ill. This is increasingly true as civilization develops.

Even as late as the Great Depression, people still could support themselves through subsistence farming/gardening and living off the land. There was still a lot of land freely open to hunting, fishing, trapping, and gathering. Restrictions were few and the population was smaller.

That is no longer the case.

Those on the right like to complain about all the minorities and poor who are dependent on the system. But that is unfair. They were made dependent, as we’ve all been made dependent. This is to say, at this point, we are all interdependent. That is what society means.

No one is self-made in our society. Those born into privilege, whether class or race privilege, are the least self-made of us all, despite rhetoric to the contrary. Only those disconnected from and clueless about their own dependent state can talk bullshit about being self-made and spout libertarian rhetoric.

Civilization brings many benefits. But it comes at great costs. Those who benefit the most are those at the top of society. And those who pay most of the costs are those on the bottom of society.

The predicament of modern civilization wasn’t lost on the American revolutionaries, most especially the founders. Living in the colonies, they found themselves on the edge of modern civilization. Founders like Thomas Jefferson could see in the Native Americans the past of their own people, the English, who were once a free tribal people. Many took inspiration from the Native Americans because it reminded them where they came from and reminded them that other alternatives existed.

Native Americans not only had more freedom than the colonists had, but also less inequality. Thomas Paine recognized that the two went hand in hand, and that they were based on the lack of privately owned land. To privatize the Commons was to destroy the ancient right to the Commons (as described in the English Charter of the Forest, although going back to Northern European common law) and so was to sacrifice one of the most basic freedoms.

There was a reason the colonists were obsessed with the rights of Englishmen. They were ancient rights, often predating the British Empire and even the English monarchy as established by the Norman Conquest. It was what all of English and Anglo-American society was built on.

The land enclosure movement shredded the social contract and upended the entire social order. It was the most brazen act of theft in English history. It was theft from the many to profit the few. In England, it meant theft from the working class (making them into landless peasants). Similarly, in the colonies, it meant theft from the Native Americans (accompanied by genocide, ethnic cleansing, and reservations that made them wards and prisoners of the state). In both cases, that theft has yet to be compensated. The descendants of the robbers still profit from that theft. The injustice goes on, generation after generation. The theft didn’t happen just once, but is an ongoing process. The theft wasn’t justified then and remains unjustified.

Paine accepted that the loss of the Commons was maybe inevitable. But he didn’t think the act of theft was inevitable. He demanded the public be compensated, and that compensation should be for every generation. It was morally wrong to impose theft of the Commons onto future generations by privatizing land without compensating those future generations for what was taken from them before they were born.

It wasn’t just about compensating wealth, as if it were a simple economic transaction. It was about offsetting the loss of some freedoms with the gain of others, and about how forcing sacrifice onto people who didn’t choose it obliges society to accept the responsibility that follows. When people are forced into a state of dependence by the ruling elite of a society, that elite and its descendant beneficiaries lose all right to complain about those people being dependent.

In losing the Commons, people lost their independence. They were made dependents against their will, forced into laboring for others or else forced into unemployed poverty. That is the price of civilization. The fair and just thing would be to ensure that the costs and benefits of that act are evenly shared by all.

A Sign of Decline?

In considering the fall of early civilizations, decentralization and privatization come into focus as important to understanding what happened. In at least some major examples, they have been closely associated with periods of decline. That doesn’t necessarily mean they were the cause. They may simply be a sign. Decline just means change, and an old order in the process of changing will always be perceived as a decline.

The creation of centralized government is at the heart of the civilization project. Prior to the city-states and empires of ‘civilization’, people governed themselves in many ways without any need of centralization, for most everything had been local. It took the centralization (i.e., concentration) of growing populations to create both the possibility and the necessity of centralized governance. A public had to be formed in opposition to a private sphere before a public good could even be spoken of. Privacy becomes more valued in crowded cities.

Early hunter-gatherers seem to have thought far less about privacy, if they had any concept of it at all, certainly not in terms of sound-dampening walls and locked doors. In that simpler lifestyle, most everything was held in common. There was no particular place set aside as the Commons, because the whole world was the commons for the people in question. Personal items would have been the exception rather than the rule. Before the modern condition of extreme scarcity and overpopulation, there would have been less motivation to fight over private property.

It should be no surprise that periods of decentralization and privatization coincided with periods of population dispersal and loss. The decline of civilizations often meant mass death or migration.

We live in different times, of course. But is it really any different now?

When people today advocate decentralization and privatization, what does that mean in the larger sense? When some fantasize about the decline of our present social order, what do they hope will result? What is motivating all this talk?

If it is a sign, what is a sign of? What changes are in the air?

* * * *

1177 B.C.: The Year Civilization Collapsed
Eric H. Cline
pp. 152-4

DECENTRALIZATION AND THE RISE OF THE PRIVATE MERCHANT

There is one other point to be considered, which has been suggested relatively recently and may well be a reflection of current thinking about the role of decentralization in today’s world.

In an article published in 1998, Susan Sherratt, now at the University of Sheffield, concluded that the Sea Peoples represent the final step in the replacement of the old centralized politico-economic systems present in the Bronze Age with the new decentralized economic systems of the Iron Age— that is, the change from kingdoms and empires that controlled the international trade to smaller city-states and individual entrepreneurs who were in business for themselves. She suggested that the Sea Peoples can “usefully be seen as a structural phenomenon, a product of the natural evolution and expansion of international trade in the 3rd and early 2nd millennium, which carried within it the seeds of the subversion of the palace-based command economies which had initiated such trade in the first place.” 57

Thus, while she concedes that the international trade routes might have collapsed, and that at least some of the Sea Peoples may have been migratory invaders, she ultimately concludes that it does not really matter where the Sea Peoples came from, or even who they were or what they did. Far more important is the sociopolitical and economic change that they represent, from a predominantly palatial-controlled economy to one in which private merchants and smaller entities had considerably more economic freedom. 58

Although Sherratt’s argument is elegantly stated, other scholars had earlier made similar suggestions. For example, Klaus Kilian, excavator of Tiryns, once wrote: “After the fall of the Mycenaean palaces, when ‘private’ economy had been established in Greece, contacts continued with foreign countries. The well-organized palatial system was succeeded by smaller local reigns, certainly less powerful in their economic expansion.” 59

Michal Artzy, of the University of Haifa, even gave a name to some of the private merchants envisioned by Sherratt, dubbing them “Nomads of the Sea.” She suggested that they had been active as intermediaries who carried out much of the maritime trade during the fourteenth and thirteenth centuries BC. 60

However, more recent studies have taken issue with the type of transitional worldview proposed by Sherratt. Carol Bell, for instance, respectfully disagrees, saying: “It is simplistic … to view the change between the LBA and the Iron Age as the replacement of palace administered exchange with entrepreneurial trade. A wholesale replacement of one paradigm for another is not a good explanation for this change and restructuring.” 61

While there is no question that privatization may have begun as a by-product of palatial trade, it is not at all clear that this privatization then ultimately undermined the very economy from which it had come. 62 At Ugarit, for example, scholars have pointed out that even though the city was clearly burned and abandoned, there is no evidence either in the texts found at the site or in the remains themselves that the destruction and collapse had been caused by decentralized entrepreneurs undermining the state and its control of international trade. 63

In fact, combining textual observations with the fact that Ugarit was clearly destroyed by fire, and that there are weapons in the debris, we may safely reiterate that although there may have been the seeds of decentralization at Ugarit, warfare and fighting almost certainly caused the final destruction, with external invaders as the likely culprits. This is a far different scenario from that envisioned by Sherratt and her like-minded colleagues. Whether these invaders were the Sea Peoples is uncertain, however, although it is intriguing that one of the texts at Ugarit specifically mentions the Shikila/Shekelesh, known from the Sea Peoples inscriptions of Merneptah and Ramses III.

In any event, even if decentralization and private individual merchants were an issue, it seems unlikely that they caused the collapse of the Late Bronze Age, at least on their own. Instead of accepting the idea that private merchants and their enterprises undermined the Bronze Age economy, perhaps we should consider the alternative suggestion that they simply emerged out of the chaos of the collapse, as was suggested by James Muhly of the University of Pennsylvania twenty years ago. He saw the twelfth century BC not as a world dominated by “sea raiders, pirates, and freebooting mercenaries,” but rather as a world of “enterprising merchants and traders, exploiting new economic opportunities, new markets, and new sources of raw materials.” 64 Out of chaos comes opportunity, at least for a lucky few, as always.

Early Civilizations and Religions, Travel and Influence

One of my earliest interests has been early religions: their beliefs and mythologies, and how they formed.

Most specifically, what has fascinated me the most are the numerous similarities between religions from diverse societies separated by vast distances: by oceans, mountains, and continents, not to mention by languages. In the ancient world, when travel from one place to another could take years, these weren’t insignificant obstacles to cross-cultural influence. And yet there was a surprising amount of travel between the earliest civilizations.

“Such transfers of ideas undoubtedly took place not only at the upper levels of society, but also at the inns and bars of the ports and cities along the trade routes in Greece, Egypt, and the Eastern Mediterranean. Where else would a sailor or crew member while away the time waiting for the wind to shift to the proper quarter or for a diplomatic mission to conclude its sensitive negotiations, swapping myths, legends, and tall tales? Such events may perhaps have contributed to cultural influences spreading between Egypt and the rest of the Near East, and even across the Aegean. Such an exchange between cultures could possibly explain the similarities between the Epic of Gilgamesh and Homer’s later Iliad and Odyssey, and between the Hittite Myth of Kumarbi and Hesiod’s later Theogony.”

Cline, Eric H. (2014-03-23). 1177 B.C.: The Year Civilization Collapsed (Turning Points in Ancient History) (p. 59). Princeton University Press. Kindle Edition.

A millennium before the Axial Age, some of the greatest early civilizations flourished. During this era, the pyramids were already a thousand years old. Then something caused all these interconnected civilizations to collapse.

I wonder how that relates to the later rise of the Axial Age civilizations and the religions that went with them. Did the collapse set the stage for entirely new systems of ideas, politics, economics, and social order?

Borders in Europe and the World

I’ve been reading about European history lately and came across some map videos. I thought I’d share them, as they are interesting and help me understand what I’ve been reading.

The first one is about the past 1,000 years of European history. That is the time-frame I’ve been focusing upon, most specifically the Roman Empire.

One thing that stood out to me in my reading is how the Romans weren’t identified as Europeans in terms of culture, politics, and economics. Most of their attention was focused on the Mediterranean, North Africa, and the Near East. At the height of their power, they controlled all the land directly surrounding the Mediterranean Sea. This can be seen clearly as the Empire grows in the first video.

The next one also focuses on Europe, over a larger time frame. The one after that broadens the scope further to other geographical regions. I also included two other videos for the fun of it. The third one below untangles the complexity of Great Britain, the United Kingdom, and England. And the last explores some of the world’s truly complex borders.