Bullshit Jobs and Essential Workers

“In our society, there seems to be a general rule that, the more obviously one’s work benefits other people, the less one is likely to be paid for it”
~David Graeber, Bullshit Jobs

“Say what you like about nurses, garbage collectors, or mechanics, it’s obvious that were they to vanish in a puff of smoke, the results would be immediate and catastrophic. A world without teachers or dockworkers would soon be in trouble… It’s not entirely clear how humanity would suffer were all private equity CEOs, lobbyists, PR researchers, actuaries, telemarketers, bailiffs or legal consultants to similarly vanish.”
~David Graeber, Bullshit Jobs

“States are bailing out privately owned corporations’ #bullshitjobs with public money. No doubt austerity measures down the line will hit the very public sector workers we now call ‘essential’.”
~Tashina Blom

“What if an economy that forces poor people of color to wear diapers all day processing chicken parts during a pandemic isn’t an economy worth saving?”
~love one another

David Graeber: “Will we then pretend that everything was just a dream?”
from Zeit Online

David Graeber: Because the market is not so much based on supply and demand as we are always told – who makes how much is a question of political power. The current crisis makes it even clearer that my wages do not depend on how much my profession is actually needed.

ZEIT ONLINE: This is the issue in your current book Bullshit Jobs: Many socially indispensable jobs are poorly paid – while well-paid employees often doubt whether their office work makes any sense at all or whether they are only doing a “bullshit job”.

Graeber: What is important to me: I would never contradict people who feel that they are making an important contribution with their work. For my book, however, I have collected voices from people who do not have exactly this feeling: They are sometimes deeply frustrated because they want to contribute to the good of all of us. But to make enough money for their families, they have to do jobs that are of no use to anyone. People said to me: I worked as a kindergarten teacher, it was great and fulfilling and important work, but I couldn’t pay my bills anymore. And now I’m working for some subcontractor that provides health insurers with information. I tag some forms all day, no one reads my reports, but I earn twenty times as much.

ZEIT ONLINE: What happens to these office workers who, because of the coronavirus, are now doing their bullshit jobs from home?

Graeber: Some people now contact me and say: I always suspected that I could do my job in two hours a week, but now I actually know that it’s true. Because as soon as you do this from home, for example, the meetings that accomplish nothing often get dropped.

Coronavirus Unmasks the Lie That You Have to Work in London to Succeed
by Aimee Cliff

Remote working is set to expose more than a few fallacies about our working life. At one end of the spectrum, it might lift the veil on the nature of white-collar work itself. Manual workers and non-office-based professionals are risking the lives of their loved ones to continue working while others – like me – are quickly able to dismantle and digitise our office cultures. As anthropologist David Graeber’s 2018 book Bullshit Jobs pointed out, a huge amount of our economy is predicated on the illusion that many people have to come into an office from 9 AM to 5 PM every day in order to create content, send messages, and schedule social media posts.

Or, as Twitter user @MikiZarzycki put it for the coronavirus era: “Everyone with a fake job gets to stay home and get paid to drop funny GIFs into Slack, everyone with a real job has to be a frontline pandemic worker or get fired.”

Coronavirus – Is telework identifying our Bullshit Jobs?
from GenX @ 50

The epidemic has resulted in statewide lockdowns in more and more states. With schools, businesses, and government offices closing or limiting their services, people are teleworking if possible, being laid off if it is not, or still working if they perform an “essential” function. The truly essential jobs – keeping the food supply chain intact, medical work, trash collection, and other life-sustaining and disease-preventing professions – clearly are not bullshit jobs. Other jobs, like teaching, restaurant work, or manufacturing, are not bullshit but can’t be done under quarantine.

But the Bullshit Jobs associated with Graeber’s categories – flunky, box ticker, taskmaster – are all easily done when working remotely. In fact, if you can do your work remotely, it might be a good sign that you have a Bullshit Job!

I would argue that many of these bullshit jobs add negative value to an organization, creating useless paperwork and internal regulations and otherwise throwing sand in organizational gears that might otherwise run more smoothly. Leaving these things undone might improve overall productivity. Will anyone examine how things worked after the COVID-19 telework is over and decide that many of these administrative jobs were unnecessary? Perhaps it would be worth it to the bottom line to continue paying some flunkies, goons, box-tickers, and taskmasters not to come in to work when this is over.

The COVID-19-Induced Crisis and Three Inversions of Neoliberalism
by Roderick Condon

If neoliberals truly understood economics they wouldn’t be neoliberals. Against Friedrich Hayek’s assertion that socialists don’t understand economics, Covid-19 exposes the neoliberal location of social value exclusively in the profit-making activities of private enterprise as a misapprehension of the essential basis of value-creating activity: the reproduction of society itself. Suddenly, it is immediately and automatically apparent that the services necessary for the continuity of society as a going concern are those which, to appropriate a phrase from Louis Althusser, reproduce the conditions of production.

Two insights follow from this. First, the devaluation – in both material and symbolic terms – of use-values by exchange-values under neoliberalism. Financial activity, only barely distinguishable from compulsive gambling, has been elevated to the highest social importance while vital reproductive activity has been, in effect, beaten down, raped and systematically pillaged. Second, David Graeber’s aptly conceptualized ‘bullshit jobs’ are now exposed as the very foundation of a farcical social order in which all activity must constitute itself in exclusively economic terms and measure itself accordingly. The decelerated pace of economic life induced by Covid-19 directly reveals the superfluity of a great deal of what constitutes ‘productivity’ under neoliberalism as in reality socially unnecessary labour-time, to refashion Marx. Furthermore, the forced imposition of such activity by the social order is itself revealed as a type of hidden tax (something the neoliberal economists show a great deal of disdain for) on real, lived life-time; that is, the time available in each individual’s lifespan for activities that truly matter.

The bullshit economy II: Bullshit-ish jobs and the coronavirus recession
by Andrew Mackay

I will revisit the difference between “the economy” (the method by which people obtain goods and services, through work or a welfare state) and “the Economy” (a reified concept based on a few stock indexes and how well billionaires and their conglomerates are doing) at a later date. In this post I will focus on how much the economy has been stripped down. Finding out which jobs are “essential” (largely the supply chains for food and medical equipment, along with education, though they are full of administrative layers and do-nothing middlemen skimming money off the top) and which are not is instructive. This is a natural experiment that goes beyond the Bullshit Jobs framework, which relied on the above-mentioned polling, a few hundred people who emailed about the bullshit parts (or wholes) of their jobs, and Graeber’s mastery of theory creation from an anthropological lens.

Landlords? Pure parasites, who get others to pay their mortgages and expansion, avoiding providing services as much as possible, which could be done collectively by tenants anyways.

Office jobs? Bullshit-ish, at the very least, if not total bullshit. The mass movement to working from home and teleconferencing within a couple of weeks indicates what a useless, environmentally-destroying artifice the office is. The office is an instrument of social control, whereby the bosses use the magic of at-will employment to add unneeded stress on people who know how to do their jobs infinitely better than management. With a huge drop in commuting, Los Angeles has some of the cleanest air it has ever had in the automobile era. Millions of hours of commuting and busywork have been cut, and people are able to balance whatever workload they actually have with accomplishing creative pursuits or otherwise having more time in the day. Graeber perceptively points out that many jobs have huge amounts of busywork because some jobs (like system administrators) require people to be on-call for a certain number of hours, but may frequently have no urgent work to do. Management hates to pay people to do nothing of substance, so they use the artifice of the office as a social control mechanism to feel they are getting their money’s worth and justify their existence.

It is clear that many jobs have bullshit-ish aspects to them. Some aspects, like interminable face-to-face meetings that could be sorted out in a ten-minute Slack chat, still persist. The “essential”, who are generally treated like dirt when there isn’t a crisis, show how little match-up there is between pay and social usefulness. A grocery store truck driver has orders of magnitude more importance than his superiors, and he and his co-workers, with so many years of combined experience in how food goes from farms to shelves, could collectively manage the supply chain. Countries like Denmark are paying a majority of laid-off workers’ salaries, though it should be re-evaluated what these workers should be paid given the social value of their work. 75% of salary seems okay (not ideal, but better than the nothing coming from America), but 75% of what, exactly? Marx’s labor theory of value has come into acute relevance in the past month, as it becomes clear who actually creates value (workers), and who is expendable (administrators, corporate executives, and industries like cruises and shale oil that have no future in a decarbonized economy).

What will the world be like after coronavirus? Four possible futures
by Simon Mair

The key to understanding responses to COVID-19 is the question of what the economy is for. Currently, the primary aim of the global economy is to facilitate exchanges of money. This is what economists call “exchange value”.

The dominant idea of the current system we live in is that exchange value is the same thing as use value. Basically, people will spend money on the things that they want or need, and this act of spending money tells us something about how much they value its “use”. This is why markets are seen as the best way to run society. They allow you to adapt, and are flexible enough to match up productive capacity with use value.

What COVID-19 is throwing into sharp relief is just how false our beliefs about markets are. Around the world, governments fear that critical systems will be disrupted or overloaded: supply chains, social care, but principally healthcare. There are lots of contributing factors to this. But let’s take two.

First, it is quite hard to make money from many of the most essential societal services. This is in part because a major driver of profits is labour productivity growth: doing more with fewer people. People are a big cost factor in many businesses, especially those that rely on personal interactions, like healthcare. Consequently, productivity growth in the healthcare sector tends to be lower than the rest of the economy, so its costs go up faster than average.

Second, jobs in many critical services aren’t those that tend to be highest valued in society. Many of the best paid jobs only exist to facilitate exchanges; to make money. They serve no wider purpose to society: they are what the anthropologist David Graeber calls “bullshit jobs”. Yet because they make lots of money we have lots of consultants, a huge advertising industry and a massive financial sector. Meanwhile, we have a crisis in health and social care, where people are often forced out of useful jobs they enjoy, because these jobs don’t pay them enough to live.

The coronavirus pandemic might have a silver lining. People might wake up to what’s really important.
by Peter Bolton

What jobs are really ‘essential’?

The first big question is: what jobs does society really need? Could it be that some are not only unnecessary but also harmful? And if so, could we just get rid of them? In the US healthcare industry, for example, private health insurance companies have ‘claims teams’ that determine whether the company will cover the cost of treatments for their policyholders. Such workers are even rewarded by their bosses for saving the company money by finding (often spurious) reasons for denying payment. Transitioning to a public system of universal care would eliminate this needless overhead and, in turn, lower healthcare costs.

Many jobs in the finance sector, meanwhile, are equally worthless. The 2007/8 financial crash, for instance, was caused in part by the bundling and trade of ‘subprime mortgage’ debt. And as The Canary has previously argued, financial markets increasingly resemble an imaginary world that bears no relation to actual production. This raises the question of whether jobs such as ‘stockbroker’, ‘currency trader’, or ‘speculator’ could simply be abolished. […]

Who really benefits?

If many jobs are pointless and many goods and services are unnecessary, then that ultimately raises a follow-up question: why do they exist? Scholars across various disciplines have tried to answer this question. In his 2018 book Bullshit Jobs: A Theory, anthropologist David Graeber suggests that the existence of pointless jobs is part of a deliberate strategy by the ruling class to keep the masses occupied so that they won’t have the time or inclination to question (or, worse, organize to dismantle) the power structures of the status quo. He says:

The ruling class has figured out that a happy and productive population with free time on their hands is a mortal danger. …

If someone had designed a work regime perfectly suited to maintaining the power of finance capital, it’s hard to see how he or she could have done a better job. […]

Time to reflect

Ultimately, the coronavirus outbreak has shown that society can continue to function without certain kinds of work being performed – so long as governments intervene to provide for the social good. At the same time, many people in wealthier countries have realized that they can live just fine with less. And on both counts, this is exactly what socialists have been arguing all along.

Bullshit Jobs in an age of Coronavirus
by imothyt

Bullshit jobs have turned into a sort of “workfare” for the educated classes.

That’s a fact that seems inescapable now that the Coronavirus pandemic has sorted jobs into essential and non-essential. The essential people are the folks stocking shelves in the supermarket, driving long-haul trucks, delivery drivers, nurses, doctors, people manufacturing essential goods (medical and otherwise), farm workers, and food workers. The rest of us are told to stay at home, shelter in place, and devise new things to do with our time, to prove that we are productive.

The pandemic has forced us all to become task-masters, box-tickers, and duct tapers for the very (probably) bullshit jobs we held before, so that we could all continue to exist on a high level of universal basic income.

I’m not an economist but the whole system always seemed deeply flawed to me. When I was in the Army in the ’80s it was patently obvious that we were all there on a sort of welfare system. And as the military-industrial complex rose and “pork-barrel” spending increased at the federal level, I started wondering how many of the jobs which supplied the military and infrastructure projects (the bridge to nowhere) were just versions of workfare. If you build missiles you’re kind of just a Goon, aren’t you? The only reason we need rockets and bombs is because others have rockets and bombs!

And all of this government “red tape” that people say kills jobs? In my lifetime it has done the exact opposite. It creates jobs! Millions and millions of jobs. Jobs for people to process oversight paperwork, efficiency modeling, insurance claims, and so on. […]

Graeber quotes President Obama after the USA passed the worst healthcare plan ever devised in human history*: “everybody who supports single-payer health care says, ‘Look at all this money we would be saving from insurance and paperwork.’ That represents one million, two million, three million jobs.” And all politicians know this for a fact. Running for president, Howard Schultz called universal healthcare “not American,” adding, “What industry are we going to abolish next — the coffee industry?” and said that single-payer would “wipe out the insurance industry.”

And it’s not just the insurance industry (which is completely useless Goon work) – think about what Medicare for All really means. The plan promises to save money – and it would – but it would do so by eliminating millions of jobs in insurance, middle management, billing departments, claims negotiation, oversight, and so on. All of those people make middle-class incomes, which in turn support the people who do the actual work of our society.

That’s why Trump needs so many people to just go back to work and why he literally doesn’t care if we live or die from this virus or really from any of the existential threats we face (global warming, etc.). I’ve long held the sneaking suspicion that most of human endeavor (especially in the West) is a con of some sort. Getting people to do stuff that they probably wouldn’t want to do by tempting them with baubles like Harleys or new cars. The economy relies on people doing all of these bullshit jobs because the economy is bullshit and only functions as long as we are producing bullshit wealth for a bullshit class of top bullshitters!

Coronavirus and the Collapse of Our Imaginations
by Jonathan Carp

Millions of us have what David Graeber calls “bullshit jobs,” jobs that produce nothing, create no wealth, but exist merely to help circulate money so goods can be distributed. Even white-collar workers with real jobs are chained to 19th-century notions of work, with a desk in a building and appointed hours at which they must sit there. We rise to alarm clocks, get into cars, belch carbon into the atmosphere, and alternate between working and goofing off as we wait for the time to pass.

But not under coronavirus. Under coronavirus, we wake with the sun, we take leisurely morning strolls, we fit our work around our children and our spouses. Instead of furtively scrolling Facebook when we get bored working, we play or make love or create. For many of us, coronavirus has been liberating amidst the quarantines. How ghastly that it has taken the threat of a global pandemic for our bosses to take advantage of technology that has existed for twenty years, at least. How cowardly of us not to demand it sooner.

What if we never went back? Imagine roads clear of traffic around the clock. Imagine air cleansed of the emissions of millions of cars. Imagine the demand for gas dropping, taking down first the price and then the environmentally devastating production. For my fellow office drones, imagine every morning waking up naturally, not to an alarm clock, and spending each day doing at each moment what you most wanted to do, not whatever would pass the time while waiting for five o’clock. That could be ours, if only we insist on it.

And what more could we imagine? Could we imagine, as my former colleague Kevin Carson has described in his work, a world of decentralized production, where “going to work” is for almost everyone a strange anachronism from a dimly remembered past? Could we imagine a world of automation that serves people rather than displaces them? Or will we be content to fritter with the margins of neoliberal capitalism, pushing for “oversight” on massive giveaways to corporations while villains like Ben Sasse clutch their pearls at the idea of a fast-food worker making more on unemployment than she does flipping burgers?

Adrian Ivakhiv: Pandemic politics, or what a disaster can do for us
by Adrian Ivakhiv

For me, this is in part a reaction against the push for “business as usual” in these strange, new times. “Keeping calm and carrying on” works for some, but easily becomes an excuse for disaster capitalism: if you can’t work normally, we’ll have you work from home. (That your kids are suddenly there with you all day, “zooming” into their classes, and that you’ve just brought your mother-in-law home from her precarious seniors’ community, and that the fridge is getting empty, is all irrelevant.) We’ll have you work harder to learn new tools that we can then require you to use when things have returned to “normal” (and if you don’t, then someone else can fill your shoes).

The other strategy is to stop and ask ourselves what’s really important. What do you need to do to protect your loved ones? Do you even know who your loved ones are? (How wide does that circle extend?) What work will keep you going in a world where business-as-usual has become an unaffordable luxury? When there’s so much to do to be happy and safe, some “bullshit jobs,” as anthropologist David Graeber calls them (no mincing words), might start to look expendable.

Taking stock, for me, means asking: how can institutions of higher learning reach out to the communities we serve to help us transition into times of likely scarcity, in which the temptation for hoarding, closing borders, and “disaster capitalizing” — the temptation of the Handmaid’s Tale — will be all too palpable? How do we re-engineer our societies to preserve and enhance democracy, equality, and ecological integration when things get bad, as any good “disaster environmentalist” knows they will? That’s the challenge ahead of us, and COVID-19 is its messenger.

What’s the point?
by Anne-Sophie Moreau

Coronavirus acts like a daunting mirror, reflecting the sheer pointlessness of what we do. It exposes a phenomenon described by anthropologist David Graeber as “bullshit jobs”: most of us, he argues, occupy positions which at best, make no difference to society, and at worst, can be downright harmful. He says the ranks of big firms are filled with minions whose sole purpose is to flatter their boss’ ego, or fill in charts as part of painstaking but ultimately pointless “processes”. That’s when they’re not busy selling goods and services that empty the consumer’s pocket whilst exhausting the planet’s natural resources. In short, entire swaths of professional activity shouldn’t even exist at all! Surely that should put you off organising yet another meeting during the coronavirus crisis. […]

After all, “bullshit jobs” haven’t put an end to “shitty jobs”, Graeber explains. On the contrary – and this is why he thinks our societies are paradoxical –, the more useful we are, the less we’re paid. How many of our government ministers are truly interested in the foot soldiers of our digital platforms? Not many; and when they do speak to them, it’s to tell them to get to work! Bullshit jobs at home, shitty jobs on the front – this is the sad dystopia we’re living in. Not to mention that many service industry jobs will likely be replaced by AI, and that central banks are thinking of showering us with “helicopter money” to avoid a global recession… Will tomorrow’s office workers be forced to stay at home, force-fed with Netflix and free money? […]

Paradoxically, this crisis might help us rediscover the real reasons why we work. By hitting the rock bottom of uselessness, we might find a way to rise back to the surface of our ambitions. And yes, these might indeed seem futile. But even the act of drawing dinosaurs can be useful, Graeber argues. Does this surprise you? “I lean towards Spinoza’s theory of work, where the aim is to increase or preserve other people’s freedom”, he told me, when I expressed my surprise at him classifying entertainment as “useful”. He went on: “The paradigmatic form of freedom is chosen activity – in other words, play. Somewhere Marx wrote that you only attain real freedom when you leave the realm of necessity and work becomes an end in itself. That might be the new paradigm of social value: to care for others, to make sure everyone leads a freer, more leisurely life.”

Notes from a Pandemic
by Tammy Sanders

One refrain I keep hearing from friends with stock portfolios and retirement funds is that we’ve got to reopen the economy. But really, is that the best we can think to do, reopen an economy that typically disenfranchised the most valuable people in it?

Instead of reopening the economy, why not rethink it, rework it, redesign it toward the more ethical, just and sensible society so many of us want to have?

An example: now that so many men, millions of them, have for the first time in their adult lives spent the majority of their waking hours in the company of their children, I wonder whether we could see a fundamental shift in policy norms and standards around parental leave and flexible work. Conceding that some men cannot wait to get back to being away for 14-hour days, I also wonder how many more will no longer abide prioritizing their professions at the expense of their families.

In his book Bullshit Jobs, David Graeber talks at length about the notion of care-related work, particularly how and why our society devalues that work. Nowadays, we’re honking horns and applauding health care providers and grocery store cashiers as “heroes” — but are we willing to insist they be paid a hero’s wage, perhaps 1/16th what an MLB pitcher or NFL quarterback earns?

Might we refuse to send children back to school, or better yet, might kids strike and refuse to go back to school until adults sort out school shootings?

Might we, as Graeber suggests in his book, commit whatever effort we can to stop making so much of what has until now made life unlivable for so many: unbearable traffic, inflexible work, toxic air, a ruthless pursuit of achievement at the expense of connection?

We crafted the world we lived in on 1 March 2020. Then, we stopped that world. If there was ever a time to point the world toward wellness, wholeness, more positivity, less polarization, now is that time.

Balance of Egalitarianism and Hierarchy

David Graeber, an anthropologist, and David Wengrow, an archaeologist, have a theory that hunter-gatherer societies cycled between egalitarianism and hierarchy. That is to say, hierarchies were temporary and often seasonal. There was no permanent leadership or ruling caste, as seen in the fluid social order of still-surviving hunter-gatherers. This carried over into the early settlements, which were initially transitory meeting places, likely for feasts and festivals.

There are two questions that need to be answered. First, why did humans permanently settle down? Second, why did civilization get stuck in hierarchy? These questions have to be answered separately. For millennia into civilization, the egalitarian impulse persisted within many permanent settlements. There was no linear development from egalitarianism to hierarchy, no fall from the Garden of Eden.

Julian Jaynes, in his theorizing about the bicameral mind, offered a possible explanation. A contributing factor for permanent settlement would be that the speaking idols had to be kept in a single location, with agriculture developing as a later result. Then, as societies became more populous, complex, and expansive, hierarchies (as with moralizing gods) became more important to compensate for the communal limits of a voice-hearing social order.

That kind of hierarchy, though, was a much later development, especially in its extreme forms not seen until the Axial Age empires. The earlier bicameral societies had a more communal identity. That would’ve been true on the level of experience, as even the voices people heard were shared. There wasn’t an internal self separate from the communal identity and so no conflict between the individual member and larger society. One either fully belonged to and was immersed in that culture or not.

Large, complex hierarchies weren’t needed. Bicameralism began in small settlements that lacked police, court systems, standing armies, etc — all the traits of an oppressively authoritarian hierarchy that would later be seen, such as the simultaneous appearance of sexual moralizing and pornographic art. It wasn’t the threat of violent force by centralized authority and concentrated power that created and maintained the bicameral order but, as still seen with isolated indigenous tribes, shared identity and experience.

An example of this is the early Egyptians. They were capable of impressive technological feats and yet they didn’t even have basic infrastructure like bridges. It appears they initially were a loose association of farmers organized around the bicameral culture of archaic authorization, and in the off-season they built pyramids without coercion. Slavery was not required for this, as there is no evidence of forced labor.

In so many ways, this is alien to the conventional understanding of civilization. It is so radically strange that to many it seems impossible, especially when it gets described as ‘egalitarian’ in placing it in a framework of modern ideas. Mention primitive ‘communism’ or ‘anarchism’ and you’ll really lose most people. Nonetheless, however one wants to describe and label it, this is what the evidence points toward.

Here is another related thought. How societies went from bicameral mind to consciousness is well-trodden territory. But what about how bicameralism emerged from animism? They share enough similarities that I’ve referred to them as the animistic-bicameral complex. The bicameral mind seems like a variant or extension of the voice-hearing in animism.

Among hunter-gatherers, it was often costume and masks through which gods, spirits, and ancestors spoke. Any individual potentially could become the vessel of possession because, in the animistic view, all the world is alive with voices. So, how did this animistic voice-hearing become narrowed down to idol worship of corpses and statues?

I ask this because this is central to the question of why humans created permanent settlements. A god-king’s voice of authorization was so powerful that it persisted beyond his death. The corpse was turned into a mummy, as his voice was a living memory that kept speaking, and so god-houses were built. But how did the fluid practice of voice-hearing in animism become centralized in a god-king?

Did this begin with the rise of shamanism? Some hunter-gatherers don’t have shamans. But once the role of shaman becomes a permanent authority figure mediating with other realms, it’s not a large leap from a shaman-king to a god-king who could be fully deified in death. In that case, how did shamanism act as a transitional proto-bicameralism? In this, we might begin to discern the hitch upon which permanent hierarchy eventually got stuck.

I might point out that there is much disagreement in this area of scholarship, as expected. The position of Graeber and Wengrow is highly contested, even among those offering alternative interpretations of the evidence; see Peter Turchin (An Anarchist View of Human Social Evolution & A Feminist Perspective on Human Social Evolution) and Camilla Power (Gender egalitarianism made us human: patriarchy was too little, too late & Gender egalitarianism made us human: A response to David Graeber & David Wengrow’s ‘How to change the course of human history’).

But I don’t see the disagreements as being significant for the purposes here. Here is a basic point that Turchin explains: “The reason we say that foragers were fiercely egalitarian is because they practiced reverse dominance hierarchy” (from first link directly above). That seems to go straight to the original argument. Many other primates have social hierarchy, although not all. Some of the difference appears to be cultural, in that humans early in evolution appear to have developed cultural methods of enforcing egalitarianism. This cultural pattern has existed long enough to have fundamentally altered human nature.

According to Graeber and Wengrow, these egalitarian habits weren’t lost easily, even as society became larger and more complex. Modern authoritarian hierarchies represent a late development, a fraction of a percentage of human existence. They are far outside the human norm. In social science experiments, we see how the egalitarian impulse persists. Consider two examples. Children will naturally help those in need, until someone pays them money to do so, shifting from intrinsic motivation to extrinsic. The other study showed how most people, both children and adults, will choose to punish wrongdoers even at personal cost.

This in-built egalitarianism is an old habit that doesn’t die easily no matter how it is suppressed or perverted by systems of authoritarian power. It is the psychological basis of a culture of trust that permanent hierarchies take advantage of through manipulation of human nature. The egalitarian impulse gets redirected in undermining egalitarianism. This is why modern societies are so unstable, as compared to the ancient societies that lasted for millennia.

That said, there is nothing wrong with genuine authority, expertise, and leadership — as seen even in the most radically egalitarian societies like the Piraha. Hierarchies are also part of our natural repertoire and only problematic when they fall out of balance with egalitarianism and so become entrenched. One way or another, human societies cycle between hierarchy and egalitarianism, whether it cycles on a regular basis or necessitates collapse. That is the point Walter Scheidel makes in his book, The Great Leveler. High inequality destabilizes society and always brings its own downfall.

We need to relearn that balance, if we hope to avoid mass disaster. Egalitarianism is not a utopian ideal. It’s simply the other side of human nature that gets forgotten.

* * *

Archaeology, anarchy, hierarchy, and the growth of inequality
by Andre Costopoulos

In some ways, I agree with both Graeber and Wengrow, and with Turchin. Models of the growth of social inequality have indeed emphasized a one-dimensional march, sometimes inevitable, from virtual equality and autonomy to strong inequality and centralization. I agree with Graeber and Wengrow that this is a mistaken view. Except I think humans have moved from strong inequality, to somewhat managed inequality, to strong inequality again.

The rise and fall of equality

Hierarchy, dominance, power, influence, politics, and violence are hallmarks not only of human social organization, but of that of our primate cousins. They are widespread among mammals. Inequality runs deep in our lineage, and our earliest identifiable human ancestors must have inherited it. But an amazing thing happened among Pleistocene humans. They developed strong social leveling mechanisms, which actively reduced inequality. Some of those mechanisms are still at work in our societies today: Ridicule at the expense of self-aggrandizers, carnival inversion as a reminder of the vulnerability of the powerful, ostracism of the controlling, or just walking away from conflict, for example.

Understanding the growth of equality in Pleistocene human communities is the big untackled project of Paleolithic archaeology, mostly because we assume they started from a state of egalitarianism and either degenerated or progressed from there, depending on your lens. Our broader evolutionary context argues they didn’t.

During the Holocene, under increasing sedentism and dependence on spatially bounded resources such as agricultural fields that represent significant energy investments, these mechanisms gradually failed to dampen the pressures for increasing centralization of power. However, even at the height of the Pleistocene egalitarian adaptation, there were elites if, using Turchin’s figure of the top one or two percent, we consider that the one or two most influential members in a network of a hundred are its elite. All the social leveling in the world could not contain influence. Influence, in the end, if wielded effectively, is power.

Ancient ‘megasites’ may reshape the history of the first cities
by Bruce Bower

No signs of a centralized government, a ruling dynasty, or wealth or social class disparities appear in the ancient settlement, the researchers say. Houses were largely alike in size and design. Excavations yielded few prestige goods, such as copper items and shell ornaments. Many examples of painted pottery and clay figurines typical of Trypillia culture turned up, and more than 6,300 animal bones unearthed at the site suggest residents ate a lot of beef and lamb. Those clues suggest daily life was much the same across Nebelivka’s various neighborhoods and quarters. […]

Though some of these sprawling sites had social inequality, egalitarian cities like Nebelivka were probably more widespread several thousand years ago than has typically been assumed, says archaeologist David Wengrow of University College London. Ancient ceremonial centers in China and Peru, for instance, were cities with sophisticated infrastructures that existed before any hints of bureaucratic control, he argues. Wengrow and anthropologist David Graeber of the London School of Economics and Political Science also made that argument in a 2018 essay in Eurozine, an online cultural magazine.

Councils of social equals governed many of the world’s earliest cities, including Trypillia megasites, Wengrow contends. Egalitarian rule may even have characterized Mesopotamian cities for their first few hundred years, a period that lacks archaeological evidence of royal burials, armies or large bureaucracies typical of early states, he suggests.

How to change the course of human history
by David Graeber and David Wengrow

Overwhelming evidence from archaeology, anthropology, and kindred disciplines is beginning to give us a fairly clear idea of what the last 40,000 years of human history really looked like, and in almost no way does it resemble the conventional narrative. Our species did not, in fact, spend most of its history in tiny bands; agriculture did not mark an irreversible threshold in social evolution; the first cities were often robustly egalitarian. Still, even as researchers have gradually come to a consensus on such questions, they remain strangely reluctant to announce their findings to the public – or even scholars in other disciplines – let alone reflect on the larger political implications. As a result, those writers who are reflecting on the ‘big questions’ of human history – Jared Diamond, Francis Fukuyama, Ian Morris, and others – still take Rousseau’s question (‘what is the origin of social inequality?’) as their starting point, and assume the larger story will begin with some kind of fall from primordial innocence.

Simply framing the question this way means making a series of assumptions, that 1. there is a thing called ‘inequality,’ 2. that it is a problem, and 3. that there was a time it did not exist. Since the financial crash of 2008, of course, and the upheavals that followed, the ‘problem of social inequality’ has been at the centre of political debate. There seems to be a consensus, among the intellectual and political classes, that levels of social inequality have spiralled out of control, and that most of the world’s problems result from this, in one way or another. Pointing this out is seen as a challenge to global power structures, but compare this to the way similar issues might have been discussed a generation earlier. Unlike terms such as ‘capital’ or ‘class power’, the word ‘equality’ is practically designed to lead to half-measures and compromise. One can imagine overthrowing capitalism or breaking the power of the state, but it’s very difficult to imagine eliminating ‘inequality’. In fact, it’s not obvious what doing so would even mean, since people are not all the same and nobody would particularly want them to be.

‘Inequality’ is a way of framing social problems appropriate to technocratic reformers, the kind of people who assume from the outset that any real vision of social transformation has long since been taken off the political table. It allows one to tinker with the numbers, argue about Gini coefficients and thresholds of dysfunction, readjust tax regimes or social welfare mechanisms, even shock the public with figures showing just how bad things have become (‘can you imagine? 0.1% of the world’s population controls over 50% of the wealth!’), all without addressing any of the factors that people actually object to about such ‘unequal’ social arrangements: for instance, that some manage to turn their wealth into power over others; or that other people end up being told their needs are not important, and their lives have no intrinsic worth. The latter, we are supposed to believe, is just the inevitable effect of inequality, and inequality, the inevitable result of living in any large, complex, urban, technologically sophisticated society. That is the real political message conveyed by endless invocations of an imaginary age of innocence, before the invention of inequality: that if we want to get rid of such problems entirely, we’d have to somehow get rid of 99.9% of the Earth’s population and go back to being tiny bands of foragers again. Otherwise, the best we can hope for is to adjust the size of the boot that will be stomping on our faces, forever, or perhaps to wrangle a bit more wiggle room in which some of us can at least temporarily duck out of its way.

Mainstream social science now seems mobilized to reinforce this sense of hopelessness.

Rethinking cities, from the ground up
by David Wengrow

Settlements inhabited by tens of thousands of people make their first appearance in human history around 6,000 years ago. In the earliest examples on each continent, we find the seedbed of our modern cities; but as those examples multiply, and our understanding grows, the possibility of fitting them all into some neat evolutionary scheme diminishes. It is not just that some early cities lack the expected features of class divisions, wealth monopolies, and hierarchies of administration. The emerging picture suggests not just variability, but conscious experimentation in urban form, from the very point of inception. Intriguingly, much of this evidence runs counter to the idea that cities marked a ‘great divide’ between rich and poor, shaped by the interests of governing elites.

In fact, surprisingly few early cities show signs of authoritarian rule. There is no evidence for the existence of monarchy in the first urban centres of the Middle East or South Asia, which date back to the fourth and early third millennia BCE; and even after the inception of kingship in Mesopotamia, written sources tell us that power in cities remained in the hands of self-governing councils and popular assemblies. In other parts of Eurasia we find persuasive evidence for collective strategies, which promoted egalitarian relations in key aspects of urban life, right from the beginning. At Mohenjo-daro, a city of perhaps 40,000 residents, founded on the banks of the Indus around 2600 BCE, material wealth was decoupled from religious and political authority, and much of the population lived in high quality housing. In Ukraine, a thousand years earlier, prehistoric settlements already existed on a similar scale, but with no associated evidence of monumental buildings, central administration, or marked differences of wealth. Instead we find circular arrangements of houses, each with its attached garden, forming neighbourhoods around assembly halls; an urban pattern of life, built and maintained from the bottom-up, which lasted in this form for over eight centuries.⁶

A similar picture of experimentation is emerging from the archaeology of the Americas. In the Valley of Mexico, despite decades of active searching, no evidence for monarchy has been found among the remains of Teotihuacan, which had its magnificent heyday around 400 CE. After an early phase of monumental construction, which raised up the Pyramids of the Sun and Moon, most of the city’s resources were channelled into a prodigious programme of public housing, providing multi-family apartments for its residents. Laid out on a uniform grid, these stone-built villas — with their finely plastered floors and walls, integral drainage facilities, and central courtyards — were available to citizens regardless of wealth, status, or ethnicity. Archaeologists at first considered them to be palaces, until they realised virtually the entire population of the city (all 100,000 of them) were living in such ‘palatial’ conditions.⁷

A millennium later, when Europeans first came to Mesoamerica, they found an urban civilisation of striking diversity. Kingship was ubiquitous in cities, but moderated by the power of urban wards known as calpolli, which took turns to fulfil the obligations of municipal government, distributing the highest offices among a broad sector of the altepetl (or city-state). Some cities veered towards absolutism, but others experimented with collective governance. Tlaxcalan, in the Valley of Puebla, went impressively far in the latter direction. On arrival, Cortés described a commercial arcadia, where the ‘order of government so far observed among the people resembles very much the republics of Venice, Genoa, and Pisa for there is no supreme overlord.’ Archaeology confirms the existence here of an indigenous republic, where the most imposing structures were not palaces or pyramid-temples, but the residences of ordinary citizens, constructed around district plazas to uniformly high standards, and raised up on grand earthen terraces.⁸

Contemporary archaeology shows that the ecology of early cities was also far more diverse, and less centralised than once believed. Small-scale gardening and animal keeping were often central to their economies, as were the resources of rivers and seas, and indeed the ongoing hunting and collecting of wild seasonal foods in forests or in marshes, depending on where in the world we happen to be.⁹ What we are gradually learning about history’s first city-dwellers is that they did not always leave a harsh footprint on the environment, or on each other; and there is a contemporary message here too. When today’s urbanites take to the streets, calling for the establishment of citizens’ assemblies to tackle issues of climate change, they are not going against the grain of history or social evolution, but with its flow. They are asking us to reclaim something of the spark of political creativity that first gave life to cities, in the hope of discerning a sustainable future for the planet we all share.

Farewell to the ‘Childhood of Man’
by Gyrus

[Robert] Lowie made similar arguments to [Pierre] Clastres, about conscious knowledge of hierarchies among hunter-gatherers. However, for reasons related to his concentration on Amazonian Indians, Clastres missed a crucial point in Lowie’s work. Lowie highlighted the fact that among many foragers, such as the Eskimos in the Arctic, egalitarianism and hierarchy exist within the same society at once, cycling from one to another through seasonal social gatherings and dispersals. Based on social responses to seasonal variations in the weather, and patterns in the migration of hunted animals, not to mention the very human urge to sometimes hang out with a lot of people and sometimes to get the hell away from them, foraging societies often create and then dismantle hierarchical arrangements on a year-by-year basis.

There seems to have been some confusion about exactly what the pattern was. Does hierarchy arise during gatherings? This would tally with sociologist Émile Durkheim’s famous idea that ‘the gods’ were a kind of primitive hypothesis personifying the emergent forces that social complexity brought about. People sensed the dynamics changing as they lived more closely in greater numbers, and attributed these new ‘transcendent’ dynamics to organised supernatural forces that bound society together. Religion and cosmology thus function as naive mystifications of social forces. Graeber detailed ethnographic examples where some kind of ‘police force’ arises during tribal gatherings, enforcing the etiquette and social expectations of the event, but returning to being everyday people when it’s all over.

But sometimes, the gatherings are occasions for the subversion of social order — as is well known in civilised festivals such as the Roman Saturnalia. Thus, the evidence seemed to be confusing, and the idea of seasonal variations in social order was neglected. After the ’60s, the dominant view became that ‘simple’ egalitarian hunter-gatherers were superseded by ‘complex’ hierarchical hunter-gatherers as a prelude to farming and civilisation.

Graeber and Wengrow argue that the evidence isn’t confusing: it’s simply that hunter-gatherers are far more politically sophisticated and experimental than we’ve realised. Many different variations, and variations on variations, have been tried over the vast spans of time that hunter-gatherers have existed (over 200,000 years, compared to the 12,000 or so years we know agriculture has been around). Clastres was right: people were never naive, and resistance to the formation of hierarchies is a significant part of our heritage. However, seasonal variations in social structures mean that hierarchies may never have been a ghostly object of resistance. They have probably been at least a temporary factor throughout our long history.1 Sometimes they functioned, in this temporary guise, to facilitate socially positive events — though experience of their oppressive possibilities usually encouraged societies to keep them in check, and prevent them from becoming fixed.

How does this analysis change our sense of the human story? In its simplest form, it moves the debate from ‘how and when did hierarchy arise?’ to ‘how and when did we get stuck in the hierarchical mode?’. But this is merely the first stage in what Graeber and Wengrow promise is a larger project, which will include analysis of the persistence of egalitarianism among early civilisations, usually considered to be ‘after the fall’ into hierarchy.

 

“They join the army because they want to be like you.”

Below is David Graeber explaining the political right (“Army of Altruists” from Revolutions in Reverse). The view is similar to that of Joe Bageant.

It’s not that they have gone mad. Don’t callously dismiss them as stupid. No, they aren’t voting against their own interests. It’s that the world they live in offers them few, if any, good choices. They are responding to a society that no longer values justice and no longer honors the agreed-upon social contract, at least what many Americans thought was agreed upon.

This is why I argue that the Democratic Party, in refusing to offer genuine reforms that help the lower classes, has made this situation inevitable. The policies of the Clinton New Democrats are Republican lite: tough-on-crime policies, cutbacks on welfare, neoliberal attacks on the working class, a war-hawk mentality willing to sacrifice the lives of the poor for wars that defend corporate interests, a bloated corporatist government that no longer serves the public good, etc.

Democrats have refused to offer an alternative that matters. Instead, they’ve offered policies that make life miserable for so many Americans, both whites and minorities. Yes, so have the Republicans, but no one expects right-wingers to fight for social justice and political reform. The desperate and disenfranchised have often turned to Republicans because they at least seem honest in what they are offering. Even for poor whites, telling them that their lives are slightly less shitty than those of poor minorities is no comfort.

For the down-and-out American, facing poverty and unemployment isn’t an inspiring situation. Even finding work at McDonald’s or Walmart is not much of a sign of success or a hope for the future. An aspiring individual can make more selling drugs or prostituting themselves. It’s not like the good ol’ days when some poor, uneducated schmuck could start their own successful business or get a high-paying entry-level factory job and work their way up into management. These days, a working-class job is a dead end and a mark of shame, not an opportunity to better oneself and one’s children.

It is unsurprising that large numbers of people in poverty turn to military service. The military offers these people discarded by society a vision of greatness and pride. They can be part of the most powerful military in the world. For too many Americans, this is the only realistic path of mobility left.

This is why conservatives praise the military and put it at the center of their political narrative. The good liberals of the Democratic Party, on the other hand, offer the poor jack shit.

Here is the passage from Graeber’s essay:

“America, of course, continues to see itself as a land of opportunity, and certainly from the perspective of an immigrant from Haiti or Bangladesh it is. But America has always been a country built on the promise of unlimited upward mobility. The working-class condition has been traditionally seen as a way station, as something one’s family passes through on the road to something else. Abraham Lincoln used to stress that what made American democracy possible was the absence of a class of permanent wage laborers. In Lincoln’s day, the ideal was that wage laborers would eventually save up enough money to build a better life: if nothing else, to buy some land and become a homesteader on the frontier.

“The point is not how accurate this ideal was; the point is that most Americans have found the image plausible. Every time the road is perceived to be clogged, profound unrest ensues. The closing of the frontier led to bitter labor struggles, and over the course of the twentieth century, the steady and rapid expansion of the American university system could be seen as a kind of substitute. Particularly after World War II, huge resources were poured into expanding the higher education system, which grew extremely rapidly, and all this growth was promoted quite explicitly as a means of social mobility. This served during the Cold War as almost an implied social contract, not just offering a comfortable life to the working classes but holding out the chance that their children would not be working class themselves. The problem, of course, is that a higher education system cannot be expanded forever. At a certain point one ends up with a significant portion of the population unable to find work even remotely in line with their qualifications, who have every reason to be angry about their situation, and who also have access to the entire history of radical thought. By the late Sixties and early Seventies, the very point when the expansion of the university system hit a dead end, campuses were, predictably, exploding.

“What followed could be seen as a kind of settlement. Campus radicals were reabsorbed into the university but set to work largely at training children of the elite. As the cost of education has skyrocketed, financial aid has been cut back, and the prospect of social mobility through education–above all liberal arts education–has been rapidly diminished. The number of working-class students in major universities, which steadily grew until the Seventies, has now been declining for decades. […]

“Why do working-class Bush voters tend to resent intellectuals more than they do the rich? It seems to me that the answer is simple. They can imagine a scenario in which they might become rich but cannot possibly imagine one in which they, or any of their children, would become members of the intelligentsia. If you think about it, this is not an unreasonable assessment. A mechanic from Nebraska knows it is highly unlikely that his son or daughter will ever become an Enron executive. But it is possible. There is virtually no chance, however, that his child, no matter how talented, will ever become an international human-rights lawyer or a drama critic for the New York Times. Here we need to remember not just the changes in higher education but also the role of unpaid, or effectively unpaid, internships. It has become a fact of life in the United States that if one chooses a career for any reason other than the salary, for the first year or two one will not be paid. This is certainly true if one wishes to be involved in altruistic pursuits: say, to join the world of charities, or NGOs, or to become a political activist. But it is equally true if one wants to pursue values like Beauty or Truth: to become part of the world of books, or the art world, or an investigative reporter. The custom effectively seals off such a career for any poor student who actually does attain a liberal arts education. Such structures of exclusion had always existed, of course, especially at the top, but in recent decades fences have become fortresses.

“If that mechanic’s daughter wishes to pursue something higher, more noble, for a career, what options does she really have? Likely just two: She can seek employment at her local church, which is hard to get. Or she can join the army.

“This is, of course, the secret of nobility. To be noble is to be generous, high-minded, altruistic, to pursue higher forms of value. But it is also to be able to do so because one does not really have to think too much about money. This is precisely what our soldiers are doing when they give free dental examinations to villagers: they are being paid (modestly, but adequately) to do good in the world. Seen in this light, it is also easier to see what really happened at universities in the wake of the 1960s–the “settlement” I mentioned above. Campus radicals set out to create a new society that destroyed the distinction between egoism and altruism, value and values. It did not work out, but they were, effectively, offered a kind of compensation: the privilege to use the university system to create lives that did so, in their own little way, to be supported in one’s material needs while pursuing virtue, truth, and beauty, and, above all, to pass that privilege on to their own children. One cannot blame them for accepting the offer. But neither can one blame the rest of the country for hating them for it. Not because they reject the project: as I say, this is what America is all about. As I always tell activists engaged in the peace movement and counter-recruitment campaigns: why do working-class kids join the army anyway? Because, like any teenager, they want to escape the world of tedious work and meaningless consumerism, to live a life of adventure and camaraderie in which they believe they are doing something genuinely noble. They join the army because they want to be like you.”