The Moral Axis of the Axial Age

Where is the world heading and upon what might it all be revolving? One would be unwise to speculate too much or offer strong predictions, but it must be noted that there has been a general trend that is well-established. Over time, Americans have been moving further and further to the political ‘left’. The majority of Americans are strongly liberal and progressive on nearly every major issue — political, social, economic, environmental, etc. But this is also happening on the political ‘right’, even among the religious. It’s interesting that as the elite have often pushed the Overton window to the ‘right’, the political ‘right’ has generally gone ‘left’ in following the rest of the American population. The whole spectrum shifts leftward.

Only a minority of right-wingers have become increasingly extreme in the other direction. The problem is that this small demographic, what I call the ‘Ferengi’ (the overlap of Fox News viewers, white Evangelicals, and Republicans), has had an outsized voice in the corporate media and an outsized influence in corporatist politics. This ideological shift, to a large extent, is a generational divide or rather an age-related gradation. Each generation becomes steadily more liberal and progressive, sometimes outright left-wing on certain issues compared to how those issues were perceived in the past.

This conflict of views has less relevance in the Democratic Party but is quite stark in the Republican Party. It’s also seen among Evangelicals. Old Evangelicals, at least among whites, are part of the Ferengi extremists. But young Evangelicals identify with the ‘progressive’ label and support such things as same sex marriage while no longer seeing abortion as an important issue, much less feeling drawn to politicized religiosity. The Ferengi are the opposite of the majority of Americans, often the opposite of a large number of moderate Republicans and conservatives, and definitely the opposite of the young.

Yet the Ferengi are held up as an equivalent demographic to these much larger demographics in creating a false narrative of polarization and division. The ideological gap, though, is in some sense real. The Ferengi fringe are disproportionately represented among those who are most politically active with high voter turnout, specifically as found among older conservatives with more money and influence. Even as they are a shrinking minority, they still largely control, or at least are overrepresented in, the Republican Party and right-wing media. The extremism of this minority emphasizes how far ‘left’ the rest of the population has gone.

This ongoing leftward pattern, what some might consider ‘progress’, isn’t exactly new. The shift hasn’t only happened over the decades and across the generations but, one might argue, goes back centuries or possibly even millennia. Being part of the political ‘left’ project has required saintly patience, prophetic vision, and heroic will. The impulses of egalitarianism and universalism were initially religious imperatives — born in the Axial Age, grown through childhood during the Middle Ages, and come to young adulthood in the Enlightenment, if still not yet having reached full maturity.

It was in the 1300s that the moral vision of Jesus, as expressed in the original Christian creed, finally captured the populist imagination as something akin to class war and sociopolitical ideology. Some of those early proto-leftists sought to overthrow the hierarchy of feudalism and church, to bring the equality of heaven down to earth. Their thinking on the matter was far from being rationally articulated as a coherent philosophy, but the demands they made were stated in no uncertain terms. They weren’t content with otherworldly ideals of rewards in the afterlife. Once imagined, those ideals inevitably became radical in their threat to worldly power.

Yet no one back then had any notion of a political ‘left’, per se. For most of the past two millennia, it remained a moral intuition bubbling up out of the collective psyche. Even so, it was a powerful moral intuition. Those peasants, in revolting, did rampage into the cities and killed more than a few of the elite. They nearly took the king hostage, although they weren’t quite sure what to do, as the commoners had never previously gained the upper hand to that degree. It would require many more centuries for the dirty masses to figure out exactly what their demands were, what exactly this moral intuition meant, but for damn sure it could not be denied and it would only grow stronger over time.

That ancient outrage of the commoners is what we have inherited. We’ve fancied it up with Enlightenment thought and clothed it in modern respectability, while the political ‘right’ has sought to blame it on French Jacobins and postmodern neo-Marxists or whatever, but in essence it remains that crude beating heart of moral righteousness and divine judgment, the authority of God’s command brought down like a sledgehammer to level the towers of human pride, as with Jesus throwing the moneychangers out of the temple. It’s not an intellectual argument and so, in response to it, rationality is impotent. But equally impotent are the churchly claims of fundies and the delicate sensibilities of social conservatives.

Every single advance of society began as a never-before-thought idea that was imagined into existence but at first denied and attacked as heretical, dangerous, crazy, or impossible. So much of what has become established and normalized, so much of what even conservatives now accept and defend, began as terrifying radicalism, fevered dream, and ranting jeremiad. Before being written about in revolutionary pamphlets, scholarly tomes, and left-wing analyses, these obstinate demands and unrealistic ideals were originally brought forth by prophets from the desert and peasants from the countryside, the uncouth and illiterate rabble who spoke with the moral certainty of faith and of God’s intimacy.

These initially inchoate inklings and urgings of the Anglo-American moral imagination took many unknown generations of struggle to take shape as we know them now. But we act like the revolutionary zeal of the late 18th century burst forth like Athena from Zeus’ head, as if intellectuals with too much time on their hands thought it all up while getting a bit tipsy in colonial taverns. More than a few of those rabblerousers and pamphlet scribblers began as religious dissenters, a tradition they inherited from their forefathers who fled to the colonies during the religious uprising and populist unrest of the English Civil War.

Thomas Paine, a man hated for claiming God was not an evil authoritarian ruling over humanity, has been largely forgotten for his later work, Agrarian Justice (1797). In offering a plan for land and tax reform, he spelled out ideas for an old age pension and a basic income. The former took almost a century and a half to finally get enacted as Social Security, and the latter we’re still working toward. These kinds of radical proposals take a while to gain purchase in political action, even when they’ve been part of the political imaginary for generations or longer. Paine himself was merely responding to an ongoing public debate that had preceded him by centuries.

Even his criticisms of organized religion were largely his having repeated what others had already said. Some of those heretical thoughts were already being recorded in the ancient world. Jesus, after all, was one of the greatest heretics of them all, a detail that didn’t go without notice by the many heretics who followed his example. Such religious heresy always went hand in hand with political heresy. The early Christians were disliked because they refused to participate in political religion. And some of the first Christian communities set themselves apart by living in egalitarian communes where positions were decided through drawing lots. Their radical beliefs led to radical actions and a radical social order.

So, it’s unsurprising that primitive communism, proto-socialism, and Marxist-like critiques began among religious dissenters, as heard during the Peasants’ Revolt and English Civil War. They took inspiration from Jesus and the original Christians, as those in the first century were themselves drawing upon words written down over the half millennium before. When the full-fledged socialists came along with their crazy dreams as implemented in Milwaukee’s sewer socialism, they were doing so as part of the Judeo-Christian tradition and in carrying forward ancient ideals.

Yet here we are. The radical notion of sewer socialism where everyone equally deserves clean water was once considered a threat to Western civilization by the respectable elite but now is considered an essential component of that very same ruling order. Conservatives no longer openly argue that poor people deserve to fall into horrific sickness and die from sewage and filthy water. What used to be radically left-wing has simply become the new unquestioned norm, the moral ground below which we won’t descend. Some might call that progress.

It’s the same thing with constitutional republicanism, civil rights, free markets, universal education, women’s suffrage, abolition of slavery, and on and on. In centuries past, these were dangerous notions to conservatives and traditionalists. They were condemned and violently suppressed. But now the modern right-winger has so fully embraced and become identified with this radicalism as to have forgotten it was ever radical. And this trend continues. As clean water is accepted as a universal right, in the near future, same sex marriage and basic income might be likewise brought into the fold of what defines civilization.

There is no reason to assume that this seismic shift that began so long ago is going to stop anytime soon, as long as this civilizational project continues its development. The aftershocks of an ancient cataclysm will likely continue to redefine the world from one age to the next. In a sense, we are still living in the Axial Age (“The Empire never ended!” PKD) and no one knows when it will finally come to a close nor what will be the final result, what world will have come to fruition from the seed that was planted in that fertile soil. The Axial Age is the moral axis upon which the world we know rotates. A revolution is a turning and returning, an eternal recurrence — and in a state of disorientation with no end in sight, around and around we go.

* * *

On the Cusp of Adulthood and Facing an Uncertain Future: What We Know About Gen Z So Far
by Kim Parker and Ruth Igielnik

Within the GOP, Gen Zers have sharp differences with their elders

Among Republicans and those who lean to the Republican Party, there are striking differences between Generation Z and older generations on social and political issues. In their views on race, Gen Z Republicans are more likely than older generations of Republicans to say blacks are treated less fairly than whites in the U.S. today. Fully 43% of Republican Gen Zers say this, compared with 30% of Millennial Republicans and roughly two-in-ten Gen X, Boomer and Silent Generation Republicans. Views are much more consistent across generations among Democrats and Democratic leaners.

Similarly, the youngest Republicans stand out in their views on the role of government and the causes of climate change. Gen Z Republicans are much more likely than older generations of Republicans to desire an increased government role in solving problems. About half (52%) of Republican Gen Zers say government should do more, compared with 38% of Millennials, 29% of Gen Xers and even smaller shares among older generations. And the youngest Republicans are less likely than their older counterparts to attribute the earth’s warming temperatures to natural patterns, as opposed to human activity (18% of Gen Z Republicans say this, compared with three-in-ten or more among older generations of Republicans).

Overall, members of Gen Z look similar to Millennials in their political preferences, particularly when it comes to the upcoming 2020 election. Among registered voters, a January Pew Research Center survey found that 61% of Gen Z voters (ages 18 to 23) said they were definitely or probably going to vote for the Democratic candidate for president in the 2020 election, while about a quarter (22%) said they were planning to vote for Trump. Millennial voters, similarly, were much more likely to say they plan to support a Democrat in November than Trump (58% vs. 25%). Larger shares of Gen X voters (37%), Boomers (44%) and Silents (53%) said they plan to support President Trump. […]

Generations differ in their familiarity and comfort with using gender-neutral pronouns

Ideas about gender identity are rapidly changing in the U.S., and Gen Z is at the front end of those changes. Gen Zers are much more likely than those in older generations to say they personally know someone who prefers to go by gender-neutral pronouns, with 35% saying so, compared with 25% of Millennials, 16% of Gen Xers, 12% of Boomers and just 7% of Silents. This generational pattern is evident among both Democrats and Republicans.

There are also stark generational differences in views of how gender options are presented on official documents. Gen Z is by far the most likely to say that when a form or online profile asks about a person’s gender it should include options other than “man” and “woman.” About six-in-ten Gen Zers (59%) say forms or online profiles should include additional gender options, compared with half of Millennials, about four-in-ten Gen Xers and Boomers (40% and 37%, respectively) and roughly a third of those in the Silent Generation (32%).

These views vary widely along partisan lines, and there are generational differences within each party coalition. But those differences are sharpest among Republicans: About four-in-ten Republican Gen Zers (41%) think forms should include additional gender options, compared with 27% of Republican Millennials, 17% of Gen Xers and Boomers and 16% of Silents. Among Democrats, half or more in all generations say this.

Gen Zers are similar to Millennials in their comfort with using gender-neutral pronouns. Both groups express somewhat higher levels of comfort than other generations, though generational differences on this question are fairly modest. Majorities of Gen Zers and Millennials say they would feel “very” or “somewhat” comfortable using a gender-neutral pronoun to refer to someone if asked to do so. By comparison, Gen Xers and Boomers are about evenly divided: About as many say they would feel at least somewhat comfortable (49% and 50%, respectively) as say they would be uncomfortable.

Members of Gen Z are also similar to Millennials in their views on society’s acceptance of those who do not identify as a man or a woman. Roughly half of Gen Zers (50%) and Millennials (47%) think that society is not accepting enough of these individuals. Smaller shares of Gen Xers (39%), Boomers (36%) and those in the Silent Generation (32%) say the same.

Here again there are large partisan gaps, and Gen Z Republicans stand apart from other generations of Republicans in their views. About three-in-ten Republican Gen Zers (28%) say that society is not accepting enough of people who don’t identify as a man or woman, compared with two-in-ten Millennials, 15% of Gen Xers, 13% of Boomers and 11% of Silents. Democrats’ views are nearly uniform across generations in saying that society is not accepting enough of people who don’t identify as a man or a woman.

“…that my children may have peace.”

Apartheid in South Africa was a violently oppressive system, one among many in history. And whenever there is oppression, there are always those who resist and fight against it. One such freedom fighter is Tim Jenkin, once little known but now better known because of a recent movie adaptation of his 1979 escape, along with two other political prisoners, from Pretoria prison. It’s an inspiring tale of moral victory, a rare case where the persecuted individual gains his release on his own terms and helps defeat injustice.

Along with a compatriot, he was arrested for setting off “leaflet bombs”. These were designed not to hurt people but to disseminate illegal literature in public areas. The purpose was to spread the message of moral struggle, to let the oppressed know they were not alone and to inform the oppressors that they would not be silenced. Having set off many of these devices, he was given a 12-year sentence and the other man an 8-year sentence. It was a punishment that might not have been so much for claims of terrorism as for his being judged a race traitor and an enemy of the state.

From the moment he entered prison he schemed about escape. The guy obviously is a genius. If you didn’t know his escape actually happened, you’d think the story was contrived, given the seeming impossibility of it. With the help of other prisoners, he spent years studying the structure of the prison, the mechanisms of its locks, and the patterns of the guards’ behavior. He used what limited resources they had access to in order to construct tools to defeat the system. The audacity of it alone was inspiring. Even after getting through dozens of locked doors, each with different keys, they still faced a sniper on the prison walls who would shoot on sight. It demonstrates how good fortune favors the prepared and the brave.

For all the good feeling that comes from a prison escape movie, it also reminds one of how much brilliance gets wasted in this world we are born into. For years, Jenkin used his talents to struggle against Apartheid and then, after being caught, to escape. Imagine, in that same time period, what he could have accomplished if he had grown up in a free society and his mind had been set toward scientific discovery, technological innovation, medical cures, or simply public service. There is nothing wrong with dedicating one’s life to political activism and defiance of moral wrong, but one suspects he didn’t dream of that profession as a child.

Think of the American Founders. They weren’t raised to be revolutionaries nor was it what they aspired to. By an accident of fate, they found themselves in a struggle for freedom and liberty. Yet by interest and talent, many of them preferred to spend their free time committed to scientific experimentation and technological invention. Even in their politics, they weren’t out to destroy the old world but were inspired to build something new. If their situation had been different, Thomas Jefferson might now be remembered for having invented a swivel chair and Thomas Paine for designing an iron bridge.

“The science of government,” wrote John Adams, “it is my duty to study, more than all other sciences; the arts of legislation and administration and negotiation ought to take the place of, indeed exclude, in a manner, all other arts. I must study politics and war, that our sons may have liberty to study mathematics and philosophy. Our sons ought to study mathematics and philosophy, geography, natural history and naval architecture, navigation, commerce and agriculture in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry and porcelain.”

Paine admitted, “That there are men in all countries to whom a state of war is a mine of wealth, is a fact never to be doubted.” And such men were unwilling to assent to the independence of others. But elsewhere in The American Crisis, he stated that in the American colonies they came to the fight reluctantly, if with courageous resolve in the final measure. Peace, though it be desired, was not offered by a military empire that demanded submission or subjugation. Knowing the high cost of what defeat would entail, it was agreed that, “If there must be trouble, let it be in my day, that my children may have peace.” The ultimate aim remained peace — if not for one generation, then for the next.

Revolution was not an end to itself. Struggle was not its own reward that built character and uplifted the spirit. Overthrowing oppression was simply the work that had to be done to make possible a good society where the following generations could do something better with their time. In a world maybe not so different, instead of the slavery and indentured servitude of colonial imperialism, we of the present living generation face a banana republic and capitalist realism, lesser evilism and bullshit jobs. The human potential lost, the raw talent and capacity corrupted — the immensity of all that goes to waste.

We are kept so busy, endlessly preoccupied and stressed, that we have little time and energy left to seek something better, either for ourselves or our children and grandchildren. The few of us scheming for escape rarely catch our breath long enough to dream about what we might do once no longer trapped in this Black Iron Prison, what might follow after. Struggle has come to define our existence and constrain our moral imagination. We need to remind ourselves of what we are hoping to accomplish, what kind of just and worthy society we wish to gift to the coming generations, what kind of peace they might have.

Ancient Outrage of the Commoners

For you are all children of God in the Spirit.
There is no Jew or Greek;
There is no slave or free;
There is no male and female.
For you are all one in the Spirit.

Based on Galatians 3:28, Stephen J. Patterson, The Forgotten Creed

When Adam delved and Eve span, Who was then the gentleman? From the beginning all men by nature were created alike, and our bondage or servitude came in by the unjust oppression of naughty men. For if God would have had any bondmen from the beginning, he would have appointed who should be bond, and who free. And therefore I exhort you to consider that now the time is come, appointed to us by God, in which ye may (if ye will) cast off the yoke of bondage, and recover liberty.

John Ball, 1381 sermon at Blackheath, after his release from prison during the Peasants’ Revolt

Who rightly claims the beating heart of moral imagination, of radical vision? Who feels the throbbing pulse of it in their veins, the flow of vitality in the body politic? Who bleeds when the cut goes deep, who cries out in pain, and who hears it? Who knows this wounding, what it means, what is to be gained? Who holds to this sacrifice and accepts its cost?

Within the broad liberalism of the Enlightenment, there was the reactionary strain of proto-conservative ‘classical liberalism’, as given voice by the Englishman John Locke (1632-1704), who gets more credit than he deserves. This strain was preempted early on by pre-Enlightenment religious dissenters and countered by radical Enlightenment thinkers, from Roger Williams (1603-1683) in the American colonies to Baruch Spinoza (1632-1677) in the Netherlands. We should acknowledge that Williams advocated Lockean-like land rights before Locke and did so with stronger moral force and egalitarian vision. And we should look to Spinoza, godfather of the radical Enlightenment, as a probable influence on Locke, who lived in the same city when Spinoza was published, whereas Locke’s own writings came only later.

The more well known and respectable figures of the Enlightenment were being carried by the swift currents that surrounded them and preceded them, going back many centuries. The British Enlightenment didn’t come out of nowhere nor did the radical tradition of revolutionary zeal. Some consider the 17th century English Civil War, or else 14th century English Peasants’ Revolt, to be the first modern revolution based on class conflict (in the same period of disruption, there is also the 16th century German Peasants’ War led by Thomas Müntzer; see The War of the Poor by Eric Vuillard). In this largely religious milieu, there arose thought that was proto-liberal, proto-progressive, proto-democratic, proto-libertarian, proto-socialist, and proto-anarchist. But if nothing else, it was often radically populist and egalitarian.

Before the Protestant Reformation, there was the extreme heresy of another Englishman, John Wycliffe (1320s-1384), who declared all were equal before God, everyone could know God for themselves, and Scripture was accessible to all. He also thought the clergy should take a vow of poverty or maybe be abolished entirely, and he went further in his radical belief that slavery is a sin. The entirety of feudalism was up for doubt and denial. Priests, lords, and kings had no rightful claim over the people, and so the freedom of the people could be reclaimed. What was to be rendered unto Caesar was not much at all. Such egalitarian righteousness would inspire the Peasants’ Revolt, and it lit a radical fire within the English imagination that never again was quenched.

Following the spirit of that age, the Englishman John Ball (1338-1381), one of Wycliffe’s disciples, preached a warning: “Things cannot go well in England, nor ever will, until all goods are held in common, and until there will be neither serfs nor gentlemen, and we shall be equal.” This was a demand for equality, on earth as it was in heaven, in the living present as it was at the time of Creation. “At the beginning we were all created equal,” Ball stated with the confident certainty of faith. “If God willed that there should be serfs, he would have said so at the beginning of the world. We are formed in Christ’s likeness, and they treat us like animals.” These words rang true among the commoners, for it could not be denied, reading Scripture for themselves, what was found in the Word of God.

The uprising that resulted, although brief, was at times violent in seeking justice and it sent the ruling order into disarray — presaging what was to come. All of that was a more threatening challenge to the legitimacy of hierarchical authority, religious and otherwise, than either Martin Luther (1483-1546) or John Calvin (1509-1564) in the following centuries. These earlier voices of scathing critique, native to England, might be why the Protestant Reformation didn’t have the same kind of impact there as it did in continental Europe, for religious dissent wended its own path on the British Isles. The religious fervor, often expressed as economic populism and political diatribe, kept erupting again and again across the centuries. It was the seed out of which the English Enlightenment bloomed, and it bore the fruit of revolt and revolution.

There long had been something in the British character that bristled against a ruling elite, maybe the cultural memory of British paganism among commoners in southern England, the Anglo-Saxon gut-level sense of freedom and communal identity in East Anglia, and the Scandinavian stoic but fiercely held independence found across the Midlands, not to mention the longstanding influence of cultural autonomy and at times political defiance from the Welsh, Scots-Irish, Highland Scots, and Irish. In defense against various ruling elites, from Romans to Normans and onward, a tradition of populist resistance and cultural pride had taken hold among regional populations, bulwarks against foreign incursions and centralized control. Attempts to enforce imperial or national unity and identity, loyalty and subservience were a constant struggle and never fully successful.

That particularly fed into the English Civil War (or rather wars) with the rebellious Puritans, pre-pacifist Quakers, the anti-authoritarian Levellers, and the primitive communist Diggers (among many others) who, in addition, took inspiration from the English tradition of the Commons and the rights of the commoners, a precursor to the American Revolutionaries’ invocation of the rights of Englishmen. Indeed, following coup d’etat and regicide (a fine example for the French revolutionaries to follow), many of the religious dissenters escaped to the colonies where they left a permanent imprint upon American culture and the American mind, such as the popular practice of the jeremiad (Sacvan Bercovitch’s The American Jeremiad & David Howard-Pitney’s African American Jeremiad).

This incited the religious protest of those like Roger Williams, who found in the colonies yet more authoritarian rule and churchly dogmatism. But it wasn’t only the religious dissenters who left England for the New World. Many Royalists on the other side of the English Civil War headed to Virginia and, in the case of Thomas Morton (1579–1647) with Merry Mount, caused trouble further north in New England. The Puritans, for different reasons than their treatment of Williams and Anne Hutchinson (1591-1643) along with the troublesome Quakers, hated what they perceived as paganism in the uncouth behavior of Morton, a man from Merrie Old England in the Devon countryside, with its paternalistic populism and rural folk tradition, shaped also by the libertine influence of the London Inns of Court; Devon was, as well, a place of organizing for the dissenting Lollards who followed John Wycliffe.

There were many distinct varieties of English heresy and rabblerousing, all of which clashed as part of a creative flux of free-wheeling public debate and often scandalous rebellion, bearing fruit in social reform and political idealism. It’s true that John Locke was part of this mix as well. Though not necessarily an advocate of slavery, neither was he a strong critic and opponent. His patron put him in the position of writing the constitution of the Carolina Colony that upheld slavery. In that founding document of violent oligarchy, based on his belief that such social compacts were immutable and eternally binding, he wrote that “every part thereof, shall be and remain the sacred and unalterable form and rule of government, for Carolina forever.”

The oppressed slaves, indentured servants, destitute laborers, and evicted natives were seen as trapped in a permanent underclass, a caste of untouchables in perpetuity, no generation henceforth having any moral claim to freedom or justice. Dissent was denied any validity and so tacit assent was assumed as an article of faith of the elite-sanctioned moral order. It was a hermetically-sealed philosophical dogmatism and ideological realism. Public debate was declared ended before it began. This was a constitutional spell as binding word magic.

“And to this I say,” declareth John Locke, arbiter of truth and reality, “that every man, that hath any possessions, or enjoyment, of any part of the dominions of any government, doth thereby give his tacit consent, and is as far forth obliged to obedience to the laws of that government, during such enjoyment, as any one under it; whether this his possession be of land, to him and his heirs for ever, or a lodging only for a week; or whether it be barely travelling freely on the highway; and in effect, it reaches as far as the very being of any one within the territories of that government.”

This was not an auspicious beginning for this particular American experiment. Yet Lockean ideology is held up as an ideal and standard of political thought in the modern Anglo-American mind of the propertied self and privatized capitalism. Among many, he has been deemed and esteemed a founding father of respectable thinking. Lockean rights impose an absolute claim on who is welcomed into the circle of privilege and who is excluded by decree. This is a non-negotiable conviction and certitude, a newfound set of Holy Commandments brought down from the mountaintop of the rational intellect by the enlightened aristocrat as divine prophet.

That was one element, sometimes referred to as ‘liberalism’, that was woven into American political thought and quite influential, unfortunately. Although never having traveled to, much less glimpsed, the New World, Locke boldly envisioned it as a place akin to Eden, as had many other Europeans — he wrote that, “In the beginning, all the world was America.” That meant it needed to be tamed and civilized with the Whiggish vision of patriarchal progress and imperialistic expansionism, white man’s burden and manifest destiny that was to bring forth moral order to the world through not only slavery but also genocide, conquest, and assimilation. This was the original liberal elitism, in all its paternalistic glory, that one hears so much about.

That authoritarian impulse remains strong within the ruling order and has informed the longstanding reactionary fear of populist uprising and democratic demand. With the imperialistic pseudo-Federalism of Alexander Hamilton and George Washington, the rot of authoritarianism had tainted the seed of freedom’s promise from the moment it was planted in the fertile soil of shared hope. Counter-revolution gained power within the American Revolution, which has ever since caused a conflation of anti-democratic rule with pseudo-democratic forms and rhetoric. This is how we’ve come to the point of a banana republic being called a democracy and used as proof of the failure of democracy by those who have done everything in their power to prevent and frustrate democratic aspiration at every turn.

In our collective miseducation and historical amnesia, many Americans can’t distinguish between democratic self-governance and anti-democratic oppression — causing moral injury, a schizoid wounding of the body politic resulting in spiritual malaise and psychosis. Even when diagnosis of what afflicts us rightly points to this unhealed trauma, the scapegoating of the democratic impulse becomes a sad charade that causes impotent despair and palsied cynicism. Amidst the throes of propagandistic infection, these fevered nightmares mock us in our still juvenile aspirations. We are meant to be shamed in our crippled and fumbling steps, to be trampled down by our own heavy hearts. We are judged guilty in the sin of our perceived weakness and failure. We are deemed as unworthy and stunted children who are a danger to ourselves and to society.

To assent to such a damning accusation is to accept defeat, but this demiurgic rule of lies fades before the revelation of anamnesis, a gnostic unforgetting. From religious dissent to radical rebellion, the burning ember of revolution within the Anglo-American soul never dies out, no matter how it gets mislabeled, misunderstood, and misdirected. The ancient outrage of the commoners, of the dirty masses and landless peasants, will continue to rise up like bile at the back of our throats, sometimes causing blind rage but at other times bursting forth with the clear sight of radical imagination and inspiration, commanding us to stand up in our full stature and stride confidently forward on a path that is set before us by an ancestral instinct, an awakening knowledge of the world before us.

We’ve been here before. It is familiar to us, the primal terrain of moral imagination, the living God whose indwelling authority speaks to us, we the living generation who carry forth a living vision, the ember setting tinder to flame to light our way. This is our inheritance, our birthright, the burden we carry and the hope that lifts us up. In the words that inspired revolution, “We have it in our power to begin the world over again.” Every moment we awaken to is pregnant with promise, something ready to be born in spirit and baptized in fire. The revelatory and revolutionary truth of the demos is all around us, for those with eyes to see. The liberation of the soul, the refutation of enslavement, the casting away of shackles — in this, we are already a free people, if only we would claim it, take hold of it and wrestle the dream down to the ground of our shared reality, in our common purpose, through our collective action.

No man can serve two masters: for either he will hate the one, and love the other; or else he will hold to the one, and despise the other. Ye cannot serve God and mammon.

Matthew 6:24

And Jesus went into the temple of God, and cast out all them that sold and bought in the temple, and overthrew the tables of the moneychangers, and the seats of them that sold doves,

Matthew 21:12

And again I say unto you, It is easier for a camel to go through the eye of a needle, than for a rich man to enter into the kingdom of God.

Matthew 19:24

Jesus said unto him, If thou wilt be perfect, go and sell that thou hast, and give to the poor, and thou shalt have treasure in heaven: and come and follow me.

Matthew 19:21

* * *

The above is a very loose and indirect response to what is found below — basically, modern democracy as an expression of the ancient demos, We the People, is not so easily dismissed:

Will Democracy’s Myths Doom Liberty?
by James Bovard

Americans are encouraged to believe that their vote on Election Day somehow miraculously guarantees that the subsequent ten thousand actions by the president, Congress, and federal agencies embody “the will of the people.” In reality, the more edicts a president issues, the less likely that his decrees will have any connection to popular preferences. It is even more doubtful that all the provisions of hefty legislative packages reflect majority support, considering the wheeling, dealing, and conniving prior to final passage. Or maybe the Holy Ghost of Democracy hovers over Capitol Hill to assure that average Americans truly want every provision on every page of bills that most representatives and senators do not even bother reading?

A bastard cousin of the “will of the people” flimflam is the notion that citizens and government are one and the same. President Franklin Roosevelt, after five years of expanding federal power as rapidly as possible, declared in 1938, “Let us never forget that government is ourselves and not an alien power over us.” President Johnson declared in 1964: “Government is not an enemy of the people. Government is the people themselves,” though it wasn’t “the people” whose lies sent tens of thousands of American conscripts to pointless deaths in Vietnam. President Bill Clinton declared in 1996, “The Government is just the people, acting together—just the people acting together.” But it wasn’t “the people acting together” that bombed Serbia, invaded Haiti, blockaded Iraq, or sent the tanks in at Waco.

President Barack Obama hit the theme at a 2015 Democratic fundraiser: “Our system only works when we realize that government is not some alien thing; government is not some conspiracy or plot; it’s not something to oppress you. Government is us in a democracy.” But it was not private citizens who, during Obama’s reign, issued more than half a million pages of proposed and final new regulations and notices in the Federal Register; made more than 10 million administrative rulings; tacitly took control of more than 500 million acres by designating them “national monuments”; and bombed seven foreign nations. The “government is the people” doctrine makes sense only if we assume citizens are masochists who secretly wish to have their lives blighted.

Presidents perennially echo the Declaration of Independence’s appeal to “the consent of the governed.” But political consent is gauged very differently than consent in other areas of life. The primary proof that Americans are not oppressed is that citizens cast more votes for one of the candidates who finagled his name onto the ballot. A politician can say or do almost anything to snare votes; after Election Day, citizens can do almost nothing to restrain winning politicians.

A 2017 survey by Rasmussen Reports found that only 23 percent of Americans believe that the federal government has “the consent of the governed.” Political consent is defined these days as rape was defined a generation or two ago: people consent to anything which they do not forcibly resist. Voters cannot complain about getting screwed after being enticed into a voting booth. Anyone who does not attempt to burn down city hall presumably consented to everything the mayor did. Anyone who does not jump the White House fence and try to storm into the Oval Office consents to all executive orders. Anyone who doesn’t firebomb the nearest federal office building consents to the latest edicts in the Federal Register. And if people do attack government facilities, then they are terrorists who can be justifiably killed or imprisoned forever.

In the short term, the most dangerous democratic delusion is that conducting an election makes government trustworthy again. Only 20 percent of Americans trust the government to “do the right thing” most of the time, according to a survey last month by the Pew Research Center. Americans are being encouraged to believe that merely changing the name of the occupant of the White House should restore faith in government.

Founding Visions of the Past and Progress

The founding debate of Federalists and Anti-Federalists is always of interest. In perusing the writings, particularly the letters, of Benjamin Franklin, Thomas Jefferson, James Madison and Thomas Paine, there is a recurrent theme about land, property and taxation, which opens up onto a vista of other issues. The American colonists, before and after becoming revolutionaries, lived in a world of tumultuous change. And the world of their cultural kin in Europe, if far away across an ocean, presented a vision of further changes that posed a warning and felt too near for comfort.

From the beginning of the enclosure of the Commons centuries prior, the people’s relationship to the physical world around them had been the hinge upon which everything else turned. With the creation of landless peasants, there came the first waves of mass urbanization, with industrialization and colonial imperialism soon following, not to mention social turmoil from the Peasants’ Revolt to the English Civil War. There was displacement as peasant villages were razed to the ground, creating a refugee crisis and mass movement of populations. What followed was widespread homelessness, poverty, starvation, malnutrition, disease, and death. That post-feudal crisis is what so many British had escaped in heading to the colonies beginning in the 1600s.

It was the greatest period of destabilization since the earlier wave of urbanization during post-Axial imperialism more than a millennium earlier. But in many ways it was simply a continuation of the long process of urbanization that was unleashed with the agricultural revolution at the dawn of civilization. The first agricultural people, often forming large villages and then city-states, went through a precipitous period of health decline, including regular plagues and famines. Even with advanced food systems and science-based healthcare, modern humans have still yet to regain the height, bone development, and brain size of paleolithic humans.

In the late 19th century heading into the next, moral panic took over American society as the majority became urbanized for the first time, something that had happened centuries earlier in Europe. There was the same pattern of worsening health that the American founders had previously seen in the burgeoning European cities. This stood out so clearly because early Americans, raised on rural life and food abundance with lots of wild game, were among the healthiest and tallest people in the Western world at the time. When those early Americans visited Europe, they towered over the locals. Some Europeans also noticed the changes in their own populations, such as one French writer stating that mental illness spread with the advance of civilization.

When Thomas Jefferson envisioned his agricultural ideal of the yeoman, he wasn’t merely proffering an ideological agenda of rural romanticism. He was a well-traveled man and he had seen many populations with which to compare. He worried that, if and when America became urbanized and industrialized, the same fate would await us. This was a prediction of the sense of societal decline that indeed took over not long after his death. Even before the American Civil War, there was the rise of large industrial cities with all of the culture war issues that have obsessed Americans ever since. But what is fascinating is that this worry and unease about modernity was such a pressing concern at the very foundation of the country.

In advocating for a democratic republic such as did not exist in Europe, where oppression and desperation prevailed, Jefferson wrote to Madison from Paris about his hopes for the new American experiment. There was the desire to learn from the mistakes of others and not repeat them. America held the promise of taking an entirely different path toward modernity and progress. As he wrote in a letter to Madison (20 December 1787):

“After all, it is my principle that the will of the Majority should always prevail. If they approve the proposed Convention in all it’s parts, I shall concur in it chearfully, in hopes that they will amend it whenever they shall find it work wrong. I think our governments will remain virtuous for many centuries; as long as they are chiefly agricultural; and this will be as long as there shall be vacant lands in any part of America. When they get piled upon one another in large cities, as in Europe, they will become corrupt as in Europe. Above all things I hope the education of the common people will be attended to; convinced that on their good sense we may rely with the most security for the preservation of a due degree of liberty.”

Jefferson may have been an aristocrat, but he was a particular kind of rich white guy. Maybe it had to do with where he lived. The Virginia aristocracy, unlike the Carolina aristocracy, lived and worked among their slaves year round. Slaves were spoken of as an extension of the family and the social structure was similar to the feudal villages that were quickly disappearing elsewhere. Jefferson became infamous for his close relations with his slaves, at least one supposedly having been his lover and several his children. He could play the role of an aristocrat, but he also knew how to associate with the poor, maybe why he was one of the only wealthy elites in the colonies who maintained a long-term relationship with the crude and low-class Thomas Paine.

When in France, Jefferson would dress in disguise to mingle among the dirt poor. He would visit with them, eat their food, and even sleep in their lice-infested beds. So, when he talked of the problems of poverty, he drew upon firsthand experience. The poverty of Europe went hand in hand with extreme concentration of wealth, land, and power. There were homeless and unemployed landless peasants living near the privatized Commons that had been turned into beautiful parks and hunting grounds for the wealthy elite. These poor were denied access to land to farm and upon which to hunt and gather, even as they went hungry and malnourished. This seemed like a horrific fate to the American mind where land and natural resources represented the foundation of freedom and liberty, the subsistence and economic independence that was necessary for self-governance and republican citizenship.

The same familiarity with the dirty masses was true of Benjamin Franklin who, like Paine, did not grow up in wealth. I doubt Franklin followed Jefferson’s example, but colonial life disallowed vast gulfs of class disparity, except in certain Deep South cities like Charleston where opulent wealth helped re-create more European-style urbanization and depredation. Most of the founders, including George Washington also of Virginia, were forced to live in close proximity not only to the lower classes but also to Native Americans. A number of the founders wrote high praise of the Native American lifestyle.

Franklin, a man who loved the good life of urban comfort, felt compelled to admit that Native Americans had a more natural, healthier, and happier way of living. He observed, as did many others, that Native Americans raised among the colonists often returned to their tribes the first chance they got, but Europeans raised among Native Americans rarely wanted to return to the dreary and oppressive burdens of colonial life. Observations were often made of the admirable examples of tribal freedom from oppression and republican self-governance, some of which were a direct inspiration in designing the new American government. This fed into the imagination of what was possible and desirable, the kinds of free societies that no European could have imagined existing prior to travel to the New World.

The American colonies, at the borderland of two worlds, were a place of stark contrasts. This drove home the vast differences of culture, social order, and economic systems. The agricultural colonies, to many early American thinkers, seemed like an optimal balance of rural health and the benefits of civilization, but it was also understood as a precarious balance that likely could not last. Someone like Jefferson hoped to restrain the worst elements of modernity while holding onto the best elements of what came before. Other founders shared this aspiration and, with early American populations having been small, it didn’t appear unrealistic. Such a vast continent, argued Jefferson, could maintain an agricultural society for centuries. The forces of modernization, however, moved much more quickly than expected.

Nonetheless, this rural way of life held on longer in the South and it fed into the regional division that eventually split the nation in civil war. Southerners weren’t only fighting about racialized slavery but also fighting against what they perceived as the wage slavery of industrialization. Their fear wasn’t only of political dependence on a distant, centralized power but also of economic dependence on big business, corporate capitalists, monied interests, and foreign investors. As an early indication of this mindset, Jefferson went so far as to advise including a “restriction against monopolies” (equal to “protection against standing armies”) in the Bill of Rights, as it was understood that a private corporation like the British East India Company could be as oppressive and threatening as any government (letter to James Madison, 20 December 1787). In fact, corporations were sometimes referred to as governments or like governments. The rise of corporate capitalism and industrialized urbanization was seen with great trepidation.

This fear of urbanization, industrialization, and modernization has never gone away. We Americans still think in terms of the divide between the rural and urban. And in the South, to this day, fairly large populations remain in rural areas. The Jeffersonian vision of yeoman independence and liberty still resonates for many Americans. It remains powerful both in experience and in rhetoric. Also, this isn’t mere nostalgia. The destruction of the small family farm and rural farm communities was systematically enacted through government agricultural policies and subsidization of big ag. Jefferson’s American Dream didn’t die of natural causes but was murdered, such that mass industrialization took over even farming. That happened within living memory.

The consequences of that exercise of political power have made America into a greater and more oppressive empire than the British Empire that the American colonists sought to free themselves from. Europe has fully come to America. The anxiety continues to mount, as American health continues to decline over the generations, such that public health is becoming a crisis. The American founders were never opposed to modern civilization, but maybe they were wise in speaking of moderation and balance, of slow and careful change, in order to protect the ancient Anglo-Saxon memory of strong communities, proud freedom, and republican virtue. A healthy, civic-minded society is hard to create but easy to destroy. The Anti-Federalists, more than any others, perceived this threat and correctly predicted what would happen if their warnings were not heeded.

They believed that the worst outcomes were not inevitable. Compromise would be necessary and no society was perfect, but their sense of promise was inspired by a glimpse of social democracy, something they did not yet have a term for. Ironically, some European countries, specifically in Scandinavia, better maintained the small-scale social order and responsive governance that many of the American founders dreamed of. Those countries have better managed the transition into modernity, have better regulated and compensated for the costs of urban industrialization, and better protected the public good from private harm. The American Dream, in one of its original forms, is nearly extinct in America. And what has replaced it does not match the once inspiring ideals of the American Revolution. Yet the promise lingers in the hearts and minds of Americans. Moral imagination remains potent, long after much of what supported it has disappeared and the memory fades.

American Citizens of the World

Patriotism has been declining for decades in the United States, and the decline continues steadily. It’s even lower in some other Western countries, such as Britain and Germany. And of course, patriotism drops lower and lower as one looks down the generations. The youngest Americans, going back many years, are split on the issue. Many of them have a positive opinion of other countries. It isn’t that most of them believe America is a bad place but that they see it as one decent country among many other worthy countries. It’s less of an us-vs-them attitude. It’s a sign that Cold War dogmatism is fading away, as the older generations die off and are replaced by those who have little to no memory of the prior century of ideological conflict and imperialistic hyper-nationalism.

Much of this has to do with each new generation having increasing rates of immigration, a trend beginning with Generation X. A large number of young Americans are immigrants, have immigrant parents, have traveled internationally, regularly interact with foreigners/foreign-born, follow international news, and are well educated. The young simply have more knowledge and experience of the world outside of the United States and that experience is personal and often positive. They are less likely to see foreign lands as scary places and foreign people as threatening. In general, increasing diversity contributes to a worldview of social liberalism, particularly for the generations that grow up in that diversity as a normal experience. It’s simply the process of Americans growing familiar with the larger world they are part of. We are finally fulfilling the revolutionary promise of our country’s founding, in slowly coming to an identity as citizens of the world, something more than a few of the American founders espoused.

To help understand this shift, it might be useful to study Joseph Henrich’s The WEIRDest People in the World. Also, a historical perspective might be needed to understand what patriotism has meant across the generations and centuries. Americans are among the most WEIRD (Western, Educated, Industrialized, Rich, Democratic) and we have been the most patriotic of the WEIRD countries, often far more patriotic than non-WEIRD countries (particularly in Asia). That still majority-held patriotism, however declining, might seem odd in this context of a society born out of Enlightenment liberalism and revolutionary radicalism. We now associate patriotism with conservatism. But it’s easy to forget that large-scale nationalism was a modern invention. In its original form, the nation-state as an ideological power structure challenged the local authority of the feudal ruling elite. It helped overthrow the ancien regime and paved the way for democratic reforms and civil rights.

In the centuries prior to the rise of fascism and other regressive forms of statism, nationalism was linked to creating unity in diverse multicultural societies. It sometimes was a force of tolerance, egalitarianism, and universalism. All citizens were equal, in theory. The modern nation-state emerged out of the colonial empires that had unleashed a mixing of populations like never before. Let’s look at the origins of American culture. Once a British government was created, a British identity had to be formed out of that immense mixture of people with their own separate cultural traditions (English, Irish, Scottish, Scots-Irish, Welsh, Palatine Germans, etc). One of the places that diversity took hold early on was in the American colonies, several of which had the ethnically English as a minority. The Pennsylvania Colony had so many non-English speakers that official announcements had to be printed in multiple languages. That diversity increased with mass immigration once America became a separate country. Without a shared tradition of ethnic culture, Americans too had to invent a new collective identity.

Patriotic nationalism was related to Whiggish progressivism that has a bit of a bad reputation now, but was extremely liberal for its time. The WEIRD forces, such as literacy, have typically been associated with increasing liberalism and one might note that nationally-mandated public education was central to this process, often initially motivated by the radically new Protestant nation-states that promoted not only education and literacy but also individualism. A progressive impulse toward reform, in general, has long gone hand in hand with nationalism. Franklin Delano Roosevelt was a devout nationalist and the New Deal was part of the patriotic fervor that fed into the Cold War vision of America as a societal project not only in defense of freedom but also committed to social responsibility. The Civil Rights movement was able to succeed because it tapped into this increasingly powerful progressive nationalism that held up the WEIRD ideals of egalitarianism, fairness, and justice.

The context has changed over time. In the past, traditionalists, conservatives, and reactionaries often were critics of nationalism in upholding more local identities of kinship, ethnicity, religion, community, and regionalism — even to the point of seeking to secede by attacking the federal government to start the American Civil War because they saw nationalism as a threat to local power, authority, and identity. Now the ideological descendants of those anti-nationalists have become the strongest nationalists. That is common. Much of what is considered conservative today was once radically liberal. And as the political right embraces what liberals fought for in the past, this opens a space for liberals to push further into unknown ideological territory. So, conservatives today are more liberal than liberals were a century ago. That is how nationalism became normalized and, through revisionist history, became an invented tradition of nostalgia. It was treated as if it had always existed, the living memory of its origins having disappeared from the public mind.

So, even though patriotic nationalism was once a liberalizing force, as it became established, it has since often been seen as a reactionary force. The liberal impulse of WEIRD societies pushes toward ever larger collective identities. Nationalism used to serve that purpose of creating a shared liberal identity in a liberal society. But now nationalism has come to be used for xenophobic reasons in attacking, rather than in welcoming, immigrants. As each following young generation embraces ever more liberalism, progressivism and social democracy (even socialism as well), the WEIRD mentality grows stronger and the desire for a greater universal identity ever more takes hold. People don’t lose the desire for group belonging nor feel less loyalty, but the shared identities grow larger and more inclusive over time.

Patriotic nationalism is still holding strong and yet quickly weakening as something else appears on the horizon. We are living in the equivalent of the late Middle Ages when the enclosure movement eroded the foundation of feudalism, but few could imagine that it would be replaced or with what. Such societal transformations caused anxiety for some and hope for others. It’s maybe unsurprising that the younger generations who are the least patriotic are also the most optimistic about the future, as they embrace what is new. The American identity has always been vague and amorphous. It was constantly shifting right from the start. Some generations forget this history, but it has a way of forcing its way back into the public mind and, in doing so, inspiring radical imagination. If nothing else is certain, to be American has meant adapting to change. Young Americans don’t hate America. They just have a different and maybe larger sense of what they love.

The Drugged Up Birth of Modernity

Below is a passage from a book I got for my birthday. I was skimming through this tome and came across a note from one of the later chapters. It discusses a theory about how new substances, caffeine and sugar, helped cause changes in mentality during colonialism, early modernity, and industrialization. I first came across a version of this theory back in the late ’90s or early Aughts, in a book I no longer own and haven’t been able to track down since.

So, it was nice coming across this brief summary with references. But in the other version, the argument was that these substances (including nicotine, cocaine, etc; along with a different kind of drug like opium) were central to the Enlightenment Age and the post-Enlightenment world, something only suggested by this author. This is a supporting theory for my larger theory on addictive substances, including some thoughts on how they replaced psychedelics, as written about previously: Sugar is an Addictive Drug, The Agricultural Mind, Diets and Systems, and “Yes, tea banished the fairies.”. It has to do with what has built the rigid boundaries of modern egoic consciousness and hyper-individualism. It was a revolution of the mind.

Many have made arguments along these lines. It’s not hard to make the connection. Diverse leading figures over history have observed the important changes that followed as these substances were introduced and spread. In recent years, this line of thought has been catching on. Michael Pollan came out with an audiobook about the role coffee has played, “Caffeine: How Coffee and Tea Created the Modern World.” I haven’t listened to it because it’s only available through Audible and I don’t do business with Amazon, but reviews of it and interviews with Pollan about it make it sound fascinating. Pollan has many thoughts about psychedelics as well, although I’m not sure if he has talked about psychedelics in relation to stimulants. Steven Johnson has also written and talked about this.

As a side note, there is also an interesting point that connects rising drug addiction with an earlier era of moral panic, specifically a crisis of identity. There was a then new category of disease called neurasthenia, as first described by George Miller Beard. It replaced earlier notions of ‘nostalgia’ and ‘nerves’. In many ways, neurasthenia could be thought of as some kind of variant of mood disorder with some overlap with depression. But a passage from another work, also included below, indicates that drug addiction was closely linked in this developing ideology about the diseased mind and crippled self. At that stage, the relationship wasn’t entirely clear. All that was understood was that, in a fatigued and deficient state, increasing numbers turned to drugs as a coping mechanism.

Drugs may have helped to build modern civilization. But then they quickly came to be taken as a threat. This concern was implicitly understood and sometimes overtly applied right from the beginning. With the colonial trade, laws were often quickly put in place to make sugar and coffee controlled substances. Sugar for a long time was only sold in pharmacies. And a number of fearful rulers tried to ban coffee outright, not unlike how psychedelics were perceived in the 1960s. It’s not only that these substances were radicalizing and revolutionary within the mind and society as seen in retrospect. Many at the time realized these addictive and often stimulating drugs (and one might even call sugar a drug) were powerful substances right from the beginning. That is what made them such profitable commodities, requiring an emergent militaristic capitalism that was violently brutal in fulfilling this demand with forced labor.

* * *

The WEIRDest People in the World:
How the West Became Psychologically Peculiar and Particularly Prosperous
by Joseph Henrich
Ch. 13 “Escape Velocity”, section “More Inventive?”
p. 289, note 58

People’s industriousness may have been bolstered by new beverages: sugar mixed into caffeinated drinks—tea and coffee. These products only began arriving in Europe in large quantities after 1500, when overseas trade began to dramatically expand. The consumption of sugar, for example, rose 20-fold between 1663 and 1775. By the 18th century, sugary caffeinated beverages were not only becoming part of the daily consumption of the urban middle class, but they were also spreading into the working class. We know from his famous diary that Samuel Pepys was savoring coffee by 1660. The ability of these beverages to deliver quick energy—glucose and caffeine—may have provided innovators, industrialists, and laborers, as well as those engaged in intellectual exchanges at cafés (as opposed to taverns), with an extra edge in self-control, mental acuity, and productivity. While sugar, coffee, and tea had long been used elsewhere, no one had previously adopted the practice of mixing sugar into caffeinated drinks (Hersh and Voth, 2009; Nunn and Qian, 2010). Psychologists have linked the ingestion of glucose to greater self-control, though the mechanism is a matter of debate (Beedie and Lane, 2012; Gailliot and Baumeister, 2007; Inzlicht and Schmeichel, 2012; Sanders et al., 2012). The anthropologist Sidney Mintz (1986, p. 85) suggested that sugar helped create the industrial working class, writing that “by provisioning, sating—and, indeed, drugging—farm and factory workers, [sugar] sharply reduced the overall cost of creating and reproducing the metropolitan proletariat.”

“Mania Americana”: Narcotic Addiction and Modernity in the United States, 1870-1920
by Timothy A. Hickman

One such observer was George Miller Beard, the well-known physician who gave the name neurasthenia to the age’s most representative neurological disorder. In 1871 Beard wrote that drug use “has greatly extended and multiplied with the progress of civilization, and especially in modern times.” He found that drug use had spread through “the discovery and invention of new varieties [of narcotic], or new modifications of old varieties.” Alongside technological and scientific progress, Beard found another cause for the growth of drug use in “the influence of commerce, by which the products of each clime became the property of all.” He thus felt that a new economic interconnectedness had increased both the knowledge and the availability of the world’s regionally specific intoxicants. He wrote that “the ancient civilizations knew only of home made varieties; the moderns are content with nothing less than all of the best that the world produces.” Beard blamed modern progress for increased drug use, and he identified technological innovation and economic interconnectedness as the essence of modernity. Those were, of course, two central contributors to the modern cultural crisis. As we shall see, many experts believed that this particular form of (narcotic) interconnectedness produced a condition of interdependence, that it quite literally reduced those on the receiving end from even a nominal state of independence to an abject dependence on these chemical products and their suppliers.

There was probably no more influential authority on the relationship between a physical condition and its historical moment than George Miller Beard. In 1878 Beard used the term “neurasthenia” to define the “lack of nerve strength” that he believed was “a functional nervous disease of modern, and largely, though not entirely, of American origin.” He had made his vision of modern America clear two years earlier, writing that “three great inventions – the printing press, the steam engine, and the telegraph – are peculiar to our modern civilization, and they give it a character for which there is no precedent.” The direct consequence of these technological developments was that “the methods and incitements of brain-work have multiplied far in excess of average cerebral developments.” Neurasthenia was therefore “a malady that has developed mainly during the last half century.” It was, in short, “the cry of the system struggling with its environment.” Beard’s diagnosis is familiar, but less well known is his belief that a “susceptibility to stimulants and narcotics and various drugs” was among neurasthenia’s most attention-worthy symptoms. The new sensitivity to narcotics was “as unprecedented a fact as the telegraph, the railway, or the telephone.” Beard’s claim suggests that narcotic use might fruitfully be set alongside other diseases of “overcivilization,” including suicide, premarital sex (for women), and homosexuality. As Dr. W. E Waugh wrote in 1894, the reasons for the emergence of the drug habit “are to be found in the conditions of modern life, and consist of the causative factors of suicide and insanity.” Waugh saw those afflictions as “the price we pay for our modern civilization.”24

Though Beard was most concerned with decreased tolerance – people seemed more vulnerable to intoxication and its side effects than they once were – he also worried that the changing modern environment exacerbated the development of the drug habit. Beard explained that a person whose nervous system had become “enfeebled” by the demands of modern society would naturally turn wherever he could for support, and thus “anything that gives ease, sedation, oblivion, such as chloral, chloroform, opium or alcohol, may be resorted to at first as an incident, and finally as a habit.” Not merely to overcome physical discomfort, but to obtain “the relief of exhaustion, deeper and more distressing than pain, do both men and women resort to the drug shop.” Neurasthenia was brought on “under the press and stimulus of the telegraph and railway,” and Beard believed that it provided “the philosophy of many cases of opium or alcohol inebriety.”25

* * *

Also see:

The Age of Intoxication
by Benjamin Breen

Drugs, Labor and Colonial Expansion
ed. by William Jankowiak and Daniel Bradburd

How psychoactive drugs shape human culture
by Greg Wadley

Under the influence
by Ed Lake

The Enlightenment: Psychoactive Globalisation
from The Pendulum of Psychoactive Drug Use

Tea Tuesdays: How Tea + Sugar Reshaped The British Empire
by Maria Godoy

Some Notes On Sugar and the Evolution of Industrial Capitalism
by Peter Machen

Coffee, Tea and Colonialism
from The Wilson Quarterly

From Beer to Caffeine: The Birth of Innovation
by Peter Diamandis

How caffeine changed the world
by Colleen Walsh

The War On Coffee
by Adam Gopnik

Coffee: The drink of the enlightenment
by Jane Louise Kandur

Coffee and the Enlightenment
by Stephen Hicks

Coffee Enlightenment? – Does drinking my morning coffee lead to enlightenment?
from Coffee Enlightenment

The Enlightenment Coffeehouses
by David Gurteen

How Caffeine Accelerated The Scientific Enlightenment
by Drew Dennis

How Cafe Culture Helped Make Good Ideas Happen
from All Things Considered

Coffee & the Age of Reason (17th Century)
from The Coffee Brewers

Philosophers Drinking Coffee: The Excessive Habits of Kant, Voltaire & Kierkegaard
by Colin Marshall

Coffee Cultivation and Exchange, 1400-1800
from University of California, Santa Cruz

Real Issues Behind Regressive Identity Politics

Here is a quickie. Jordan Peterson likes to oversimplify things with easy answers, as that is what his audience wants. He mixes genuine information with misinformation and misinterpretation. Then he too often exaggerates something into a caricature of moral absolutism, as with his treatment of social roles in defense of patriarchy, class hierarchy, and race realism. It’s all about the lobsters or some such thing.

He does this with gender all the time, in treating it as a clear demarcation. The reality, as always, is much more complicated, even on the biological level: “It’s far from uncommon for people to carry genetics of both sexes, even multiple DNA” (Is the Tide Starting to Turn on Genetics and Culture?). Mixed genitalia is far from uncommon as well, although in the past doctors would have done gender assignment to babies to ensure they conformed to perceived biological norms. Here is a typical example of a strong view from Peterson:

“And the biggest sex differences that we know of that aren’t morphological are in interest. So women are more interested in people, by and large, and men are more interested in things, by and large. And the difference is actually large, it’s one standard deviation. And so that means if you’re a man, you would have to be more interested in people than 85% of men to be as interested as 50% of women. And if you’re a woman, you’d have to be more interested in things than 85% of women to be as interested as the 50th percentile male. So the difference is actually quite substantial, and it’s certainly large enough to drive occupational choice differences, which it does” (Jordan Peterson, Christina Hoff Sommers, and Danielle Crittenden (The Femsplainers)Full Transcript).
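The percentile arithmetic in that quote follows directly from the normal distribution: when two normal distributions have means one standard deviation apart (an effect size of d = 1), the mean of one group falls at roughly the 84th percentile of the other, which is where the quoted “85%” figure comes from, rounded. A minimal sketch, using only Python’s standard library:

```python
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# With a one-standard-deviation gap between group means, a person at
# the mean (50th percentile) of one group lands at roughly the 84th
# percentile of the other group.
print(f"{normal_cdf(1.0):.1%}")  # about 84.1%
```

Worth keeping in mind: even at d = 1 the two distributions overlap heavily, so the statistic describes group averages, not individuals.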

Peterson will use this as a rationalization for gender disparity in careers, such as the low numbers of women in STEM fields. Yet many convincingly argue that some of this is cultural. Consider that in India and Latin America, women are the majority in the tech industry, the career we think of as being the most male-centric in the United States. By the way, many other countries also see greater numbers of women in leadership positions, such as presidents and prime ministers.

He will sometimes vary his emphasis by saying that men prefer ideas while women people and relationships (Jordan Peterson, On the Differences Between Men and Women). Yet women are much higher achievers in education. Most college students are women and they outnumber men in grad school. Then they come out with 57% of the bachelor’s degrees, 60% of the master’s degrees, and 52% of the doctorates. Women dominate 7 out of 11 areas of study, including tough fields like biological science and medical science. That doesn’t indicate a gender difference crippling women’s interest in ideas and the ability to work with ideas.

Consider one of Peterson’s favorite topics, the thought of Carl Jung. He talks a lot about archetypes, if in such simplistic ways that Jung is rolling in his grave. One way he’ll talk about gender differences is in terms of personality. So, let’s go with Jungian typology, as seen in the data collected through the Myers-Briggs test.

There is only one area that shows a minor gender divide. Most dominant Thinking types are men and most dominant Feeling types are women. It’s a difference in how one makes decisions, whether through objective reasoning or by subjective values. It’s a bit complicated, though, since Jung held that for introverts the opposing function would be more apparent outwardly. So, the introverted Feeling type would tend to deal with the world through extraverted Feeling, the latter being what is sometimes referred to as the aspirational function.

Complexities aside, the data shows that this gender divide does not apply to 30-40% of the population, at least in this country. Yet Peterson is ready to build entire gender stereotypes that should be used to socially construct the moral order that upholds gender roles, based on an assumption of genetic determinism and essentialism. It’s amazingly naive. That’s not to dismiss the importance of biology, but we have no idea how much of this difference is shaped by genetics versus environment and epigenetics. He is simply assuming that humans are mere puppets of their genetic fate, that culture and history have no great relevance in shaping our shared conditions. Even if that were true, what about the 30-40% who are by nature contrary to the conventional expectations and norms of conservative ideology?

As a male Feeling type, according to the official Myers-Briggs, nothing Peterson says resonates with me about how men are supposed to be. I’m not even sure most male Thinking types would be all that persuaded either. His audience is a very narrow selection of males who identify with or aspire to his ideologically-driven masculine ideal. As a minority group even within the WEIRDest of WEIRD populations found primarily in North America, these Peterson true believers aren’t likely representative of most men in the world. That isn’t to say this group is insignificant in their sense of alienation, frustration, and outrage, as I’d suggest they are canaries in the coal mine.

For certain, I don’t entirely disregard conservative concerns about gender, specifically problems with boys (The Boy Crisis). There is an argument to be made that some neurocognitive conditions, such as autism and ADHD, are extreme expressions of otherwise normal masculine attributes that no longer are deemed socially desirable in our society, specifically in schools. Others have noted boys are physically and cognitively maturing later than prior generations, as girls are maturing earlier. This stunting and growing gap might be caused by hormones and hormone mimics in the industrial diet and packaging. Whatever the cause, it sheds light on why women have suddenly come to dominate higher education.

It’s intriguing, actually, the changes that have happened. As a sign of something gone wrong, there has been a continuous decline in sperm counts, testosterone levels, and musculoskeletal strength over the generations, specifically in the United States and the Western world. Some data indicates this goes back to the early 20th century when measurements were first taken, but the trend likely began in the prior century. This change is dramatic. And it’s being felt on a personal level. Young men admit to feeling conflicted about the social expectations of being masculine, as it simply doesn’t match their own experience. The average man is just not feeling all that manly these days. And those who feel like inferior beta males can be drawn to self-help gurus like Peterson who promise to make real men out of them.

All of this is fair debate to be had, but let’s quit with the stereotypes already and allow for nuance. And it’s far from a new debate (The Crisis of Identity, Moral Panic and Physical Degeneration, Old Debates Forgotten, & Rate of Moral Panic). Going back to the late 1800s, there was a rising concern about boys becoming effeminate and men being emasculated. That was around the time the industrial diet began taking over American society. At first, it was an increase of starchy carbs and added sugar, but soon after seed oils replaced animal fats like butter and lard. And who knows what chemicals were being used in early canning and such. Actually, the concern about shifting gender roles goes further back to before the American Civil War. Besides diet, there were many other things going on. Industrialization, of course, went hand in hand with urbanization that in a short period of time became mass urbanization, with most Americans urbanized by the dawn of the 20th century.

Reactionaries arose to try to re-enforce what they thought were divine-ordained gender roles based on nostalgia about rural life and they did so in ways that were clumsy and oppressive. But that isn’t to deny something odd was and still is going on. That is why reactionaries continue to hold sway. For all their foolishness, they are pointing to real issues and occasionally they do bring up genuine information to be taken seriously. Peterson wouldn’t be so popular if he was entirely full of shit. He is speaking to what many others are feeling, even as he distorts what it all means with regressive white male identity politics. If we ignore or dismiss the reactionaries now without responding to what made them turn reactionary in the first place, the persuasive pull of the reactionary mind will only become more powerful.

* * *

They select different degree programmes: Are women and men born with different interests?
by Rasmus Friis

A disappointing response

According to Christian Gerlach, professor of cognitive neuroscience at the University of Southern Denmark, the answer is a bit disappointing: it is actually impossible to say with certainty whether it is biological or social conditions that get men to apply for, say, IT and engineering subjects, and women to choose, say, subjects in the healthcare sector.

This is mostly due to the fact that it is hard to carry out experiments that clearly delineate cause and effect. You can’t just change the gender of your subjects, turn men into women and so on, and then find out what effect it has.

»We can’t control all the variables, which means that it is extremely difficult to find causality. It turns into guesswork when you have to say whether one factor or another is decisive. This is the fundamental problem.«

Gerlach is sceptical about the robust interpretations made by Jordan Peterson and the authors of the article.

He is particularly sceptical of the explanations that point to biology as a decisive factor behind the genders’ different interests.

»It has been incredibly difficult to associate complex patterns of thinking and acting to biological things like hormones,« says Christian Gerlach.

»I myself have a background in the biological part of psychology, so I should be open to the fact that you can explain a lot of these differences biologically. But I don’t personally think so. I think it has more to do with socialisation.«

Difference between Jude and Judith

He says that we affect each other in subtle ways and he mentions an experiment:

A baby sits on a carpet in a laboratory with several different toys in front of it. The researcher invites a test subject into the laboratory and asks the person to keep an eye on the baby while the researcher goes outside the door.

The researcher indicates each time whether it is a boy or a girl, and this is precisely what turns out to be decisive for the experiment.

If the experimental subjects think they are taking care of a girl, they tend to give the baby a doll or another toy that we consider feminine. If the subjects think that they are taking care of a boy, they will more often give the baby a toy car or something they consider masculine. A variation of the experiment can be seen in this BBC video.

Conclusion: There’s a difference between being named Jude and Judith.

»When you ask the subjects afterwards, it is clear that they have not done it consciously. It is an example of how this works slightly outside our field of attention,« says Christian Gerlach.

You can also find studies that support the opposite hypothesis, however. Researchers have, in a couple of experiments, shown that monkeys also prefer the toys that many humans would connect with their gender. Male apes, for example, choose to play with cars rather than dolls.

One of the researchers behind the first experiment, Gerianne Alexander, said to New Scientist that you should be careful about over-interpreting the results. But she added:

»It is probable that there is a biological tendency, that is then amplified by society.«

You’re Not The Man Your Father Was
by Neil Howe

Studies show that men’s testosterone levels have been declining for decades. The most prominent, a 2007 study in the Journal of Clinical Endocrinology and Metabolism, revealed a “substantial” drop in U.S. men’s testosterone levels since the 1980s, with average levels declining by about 1% per year. This means, for example, that a 60-year-old man in 2004 had testosterone levels 17% lower than those of a 60-year-old in 1987. Another study of Danish men produced similar findings, with double-digit declines among men born in the 1960s compared to those born in the 1920s.

The challenges to men’s health don’t end there. Rates of certain reproductive disorders (like testicular cancer) have risen over time, while multiple European studies have found that sperm counts are sinking. These trends coincide with a decline in musculoskeletal strength among young men: In a 2016 study, the average 20- to 34-year-old man could apply 98 pounds of force with a right-handed grip, down from 117 pounds by a man of the same age in 1985. Though grip strength isn’t necessarily a proxy for overall fitness, it’s a strong predictor of future mortality. […]

What’s happening to men physically dovetails with a broader story of social transformation. The economy is shifting away from jobs that favor men, like manufacturing, and toward sectors dominated by women. Young men have fallen behind women in educational attainment. They’re increasingly dropping out of the workforce and expressing less work centrality. The anxiety over the state of men mirrors a bigger debate over America’s national identity. Americans have traditionally seen themselves as a “pro-testosterone” nation: restless, striving, and rowdy. Yet in his new book The Complacent Class, Tyler Cowen argues that America is losing the dynamism, mobility, and enterprise that made it special. This anxiety may have even led the old-fashioned, overtly macho President Trump to victory.

The confusion over what masculinity means today is reflected in the conflicted feelings of males now coming of age. Most American Millennial men report feeling pressured to project a traditional image of manhood characterized by traits like toughness, self-reliance, and hypersexuality—but when asked if they wish to emulate these characteristics themselves, the majority don’t. A separate survey asked men to rate themselves on a scale of “completely masculine” to “completely feminine.” Only 30% of 18- to 29-year-olds chose “completely masculine.” That’s compared to 65% of men over 65.
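As a quick sanity check on the arithmetic in the excerpt above: a 1% decline per year from 1987 to 2004 spans 17 years, which gives exactly 17% as a simple sum and about 15.7% when compounded, so the quoted figure is in the right range either way. A small sketch:

```python
years = 2004 - 1987  # 17 years between the two cohorts

simple_decline = 0.01 * years          # naive sum of 1% per year
compound_decline = 1 - 0.99 ** years   # 1% annual decline, compounded

print(f"simple: {simple_decline:.1%}, compounded: {compound_decline:.1%}")
```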

The Insanity of Gaslighting Ourselves

“Two plus two will never make five. That’s not the problem. And George Orwell at the end, Winston’s being tortured, and he’s made to say two plus two equals five, and this totalitarianism makes us all lie. [Hannah Arendt] said that’s not the power. It’s the fact that in a world where people are going to say it is even when they know it isn’t. That is deeply estranging. That’s what creates those conditions of loneliness and despair. That, for her, is the wickedness of the political lie. People don’t believe that two plus two makes five. They don’t believe half of what’s said.”
~Lyndsey Stonebridge, The Moral World in Dark Times: Hannah Arendt for Now

All the time, people say what they don’t mean. Or else it’s not clear what they mean, the actual significance behind their words, what is most fundamentally motivating them. This duplicity has become the normal way of relating. And this is the basis of the identities that possess us. A simple example of this is the often heard statement that American voters get what they want. That is like saying enslaved Africans were happy with their violent oppression. There are endless other examples, and the most powerful are those we don’t notice in ourselves.

Such statements are patently absurd when taken at face value. And one suspects that, at some level, most people know that they are not true, even as they say them. It’s not what they really believe, but it obviously satisfies some need or purpose. Our modern society is filled with such bald-faced contradictions to what we know and feel. This is part of social control. It’s one thing to gaslight others but it’s even more powerful to get them to gaslight themselves. It’s the ultimate betrayal.

Once someone is psychotically disconnected from a direct and personal sense of reality, they become schizoid and compliant. Without grounding in the world beyond rhetoric, people become vulnerable to those who control and manage public perception. The backfire effect plays its role in ideological defense, but that isn’t the underlying force at play. Once identity is solidified, people will defend it on their own by various means. That leaves the issue of how that ideological identity took shape in the first place.

This is true of all of us. There is something profoundly disorienting and alienating about modern society. We lose the ability to discern what is of genuine value. So, we turn to narratives to offer us an illusion of certainty. Then the reality in front of us is no longer compelling. We believe what we are told. Then we repeat it so often that we forget it was told to us. Because it confirms and is confirmed by ‘mainstream’ mediated reality, we sound perfectly reasonable and are taken seriously by others.

This isn’t a mere individual condition. It’s collective insanity. Yet the spell it has over us dissipates with awareness. But few will accept this awareness, quickly retreating to the safety of the group mind, of ideological realism. People briefly awaken all the time, only to fall back to sleep, drawing the covers back over their head. Being reoriented to a deeper truth can feel disorienting, like awakening from a deep slumber. It’s hard to grapple with what to do with this knowing. Yet it’s near impossible to unsee after being jolted fully awake. It’s not a state one would choose, to be on the outside of the social world into which one was born.

It forces one to relearn what it means to be human. Rather than being enlightened with clarity, it’s to become aware that one is in a state of epistemological infancy. Everything one thought one knew is now under question and doubt. A different way of listening is required, to sense what is real and true, what is valid and meaningful. Relating well to others becomes even more challenging. Just because one has glimpsed some flickering light through the rhetoric doesn’t instantly dispel the shadows. The tape loops go on playing in one’s mind, in response to the tape loops in the minds of others. In not knowing how to respond, one increasingly finds oneself falling into silence.

Sometimes a non-response is the only response that remains. In a world full of words, one ever more notices the silence in between, what is not said. All that can be done is to witness it, to hear what others have refused to hear, particularly what they have denied in themselves. One holds a space open. One waits. One listens. In doing so, another voice is heard, only a whisper at first. It’s a different kind of voice, a different kind of self. There is an intimacy in how it speaks. We are not alone and isolated. We are not alienated. The truth was never destroyed. Reality remains.

Will bone broth scum cause cancer and impotence?

There is heated debate about bone broth ‘scum’, the bits and foam that float to the top in simmering. What is this disgusting stuff? And if consumed, how quickly will it kill you? It’s mostly coagulated protein and, bad reputation aside, it’s probably of no great concern (Curiosities: When boiling meat, what causes foam on liquid?), even as it gets referred to as ‘impurities’. Some are religious in skimming it off because there must be something bad about it. Others don’t see the point.

The original purpose for this practice might have been because, back in the good ol’ days, meat often had insects and larvae on it which floated to the top when added to a heated liquid. But at this point, it’s just a tradition passed down the generations. People do it because their momma did it that way and she did it because of her own momma before her. Then people come up with rationalizations for why they continue doing so long after the original reason was forgotten.

As for the coagulated protein, it doesn’t have a lot of flavor and so shouldn’t affect that aspect of the bone broth, but some will swear by the ‘scum’ adding a bad taste. So, if you don’t like the flavor, then by all means skim it off or otherwise prevent it from forming. And there are many ways to deal with it (Do You Really Need to Skim Off Scum on the Surface of Your Bone Broth?). Simply roasting the bones beforehand supposedly will reduce the amount of flotsam.

That said, it’s theoretically possible that toxins from fat could be in the scum if using feedlot bones, but if that is your concern all of the fat should be thrown away along with the scum. The body primarily stores toxins in the fat, although heavy metals can get into the bones. There are studies that show some lead in bone broth, but small amounts of lead are found in all kinds of things, including the water you drink.

Dr. Catherine Shanahan argues that the level of lead in bone broth is typically low, although it is a reason to rely on pasture-raised animal foods in general (Broth: Hidden Dangers in a Healing Food?). For other reasons of oxidation, she’d still recommend removing the fat from broth and not using it (Bone Broth Risks: Skim the Fat!). As long as it is simmered at low temperature, the ‘scum’ won’t emulsify and so will remain at the top where it can later on be removed along with the fat cap. 

But if it’s good quality animal parts, you don’t necessarily want to waste that nutrient-dense fat. Megan Stevens explains that you should make an initial batch of gelatinous meat broth in the first 2-3 hours (or 30 minutes in a pressure cooker) because the fat won’t have gone rancid yet (How to Make Bone Broth and Avoid Rancid Fat — A Complete Broth Guide). Only after the removal of the meat broth and fat would you want to do a long simmering bone broth, in one or two further batches, which gets more of the collagen and amino acids.

Not skimming fat will seal in heat and so increase the temperature. This might turn a simmer into a boil. If cooked too high, amino acids, minerals, and fat will emulsify into the broth. This would affect clarity and might affect flavor. High heat will also break down gelatin and so the bone broth won’t gel as well. For a thick soup or gravy, emulsification is a good thing and, in that case, boiling is recommended. So, this depends on personal taste and purpose.

There is a related issue with long cooking periods. Proteases will break down proteins into free amino acids and protein fragments, some of which taste bitter (What To Do With Bitter Broth?). If enough of these bitter-tasting molecules get emulsified, it could negatively affect the flavor. While simmering, skimming the coagulated proteins (i.e., ‘scum’) from the surface will help to prevent this. This will solve multiple problems by removing fat and proteins, whether or not one wants to save the fat for other purposes.

* * *

Ask The Food Lab: Can I Make Stock in a Pressure Cooker or Slow Cooker?

I’m convinced that cooks who insist that a stock must be skimmed of excess fat and scum religiously are really only saying that so they have an excuse to stand by the pot and inhale.

Broth is Beautiful

Scum will rise to the surface. This is a different kind of colloid, one in which larger molecules–impurities, alkaloids, large proteins called lectins–are distributed through a liquid. One of the basic principles of the culinary art is that this effluvium should be carefully removed with a spoon. Otherwise the broth will be ruined by strange flavors. Besides, the stuff looks terrible. “Always Skim” is the first commandment of good cooks.

Making fresh bone stock

A lot of people will tell you to skim the froth that forms at the surface of a stock as it cooks, but it’s harmless. Skimming the foam or “scum” as it’s sometimes called, is simply a matter of culinary preference and is done to create a clear broth or stock. If you don’t mind the way it looks, leave it and all the goodness that it might contain.

Silly Bone Broth Myths You Can Ditch Right Now

Myth 8: You must skim the scum from the bone broth before the impurities in the scum pollute the flavor of your bone broth.

Poor scum! I feel so sorry for it at times. Another rule from the days of crystal clear broth bites the dust. The fluffy white foam that sometimes collects at the surface of your broth is just protein from the meat and bones in your broth. The heat changes the outer surface of proteins, and causes them to change shape or denature. That’s what happens to all proteins exposed to heat during cooking – unless you just eat your food raw! Scum isn’t anything awful. It’s not blood, toxins, fat, or anything awful. Just proteins. Even the reddish juice you sometimes see on bones isn’t blood (hemoglobin). It’s myoglobin. I am cooking bones to get at the proteins, nutrients, and healthy fats. The so-called scum – it’s just protein. I don’t skim it out of the pot. After an hour or two, fat rises to the surface of the broth and those misunderstood proteins become a source of flavor and color. Our minds are taught that scum is to be avoided. We think we can taste the evil in our pots, and if you truly can taste it and it bugs you, by all means remove it. If you have clean, high quality bones, why throw away the outer layer? Did it look at you funny? No, I didn’t think so.

Bone Broth Benefits – Is It Important to Skim?

The scum has some amino acids and impurities, which could include toxins. Chefs and traditional cooks often teach you to skim the scum off with a fine mesh strainer so that the impurities are removed.
However, when I polled some of my friends who are traditional cooks, none of them skimmed the scum. When I went to some of my favorite GAPS diet websites, I found a mix of responses from people who skimmed and people who did not. I must confess that I have taken the easy way out and not skimmed, but I wanted to find out whether it was a good practice to adopt.

Pressure-Cooked Stocks: We Got Schooled.

Many cooks have an intuitive feeling that pressure cooking stocks is a bad idea. Their reasoning isn’t related to the previous discussion and isn’t borne out by our tests. Here are the reasons they usually give (and my responses):

  • Pressure cooking will make the stock cloudy. That is incorrect. The boiling in a pressure cooker is no more violent than in a pot, so stocks don’t get any cloudier. We have done many side-by-sides to prove this.
  • Pressure cooking extracts bitter components. No one has detected bitterness in any pressure-cooked stock we’ve made.
  • Not being able to skim the stock will introduce off-flavors. We have not noticed this in any of our tests.

How to Make the Best Chicken Stock

The final stocks were remarkably similar. If anything, the not-skimmed stock was a tiny bit clearer than the skimmed one, which definitely contradicted my expectations.

I don’t have a great explanation for this, but here’s one theory I’ve come up with: A lot of the scum that initially floats to the surface of a stock is protein from some of the meat’s fluids. When you’re making consommé, which is concentrated, crystal-clear broth, one of the classic techniques for clarifying the liquid is with a protein raft on the surface, often made from egg whites. Perhaps, at a gentle enough simmer, the protein blobs that come to the surface of the stock end up working like a consommé’s protein raft, trapping particles in the broth and clarifying it in the process. If the stock is simmered and handled gently enough, those impurities won’t be distributed back into the broth and can be fine-strained out.

Either way, this test suggests that as long as you keep the heat low and have a fine-mesh strainer, you’re safe letting the stock be without skimming it. As for the fat that accumulates on the surface, I find it easiest to remove once the stock has chilled and the fat has congealed on the surface.*

* It is worth mentioning, though, that I tested these stocks in smaller batch sizes. It’s possible that larger batches could generate a deeper layer of grease on the surface, which, in turn, could affect the stock’s flavor and clarity in a different way.

what *is* that foamy scum on chicken soup?

The scum is not just protein. The scum floats; protein is more dense than water and would sink. The floating stuff is coagulated (denatured) lipoprotein, the same “L” that is in the terms HDL and LDL (from your cholesterol workup).

Protein combined with lipid (fat) is less dense than water (the “D” in HDL and LDL refers to the density). When boiled, these lipoproteins coagulate and float. The scum tastes just fine to me, it’s laughable to call it “impurities.” All cells have lipoproteins in their cell membranes.

However, leaving them in will cause the stock to be irreparably cloudy in the end. If the goal is a stew, then who cares if the stock is cloudy. If the goal is consommé, then skim away.

I think there is a “Chinese cream stock” in which the items (pork, duck, chicken) are deliberately cooked at a rolling boil so as to incorporate all the flavors into the liquid. The result is quite creamy-looking. In this method the lipoproteins are physically forced into a colloidal suspension.

Why skim “scum” from the surface of a simmering stock?

Bruce Goldstein
Skimming is for aesthetic purposes. The scum is denatured protein, mostly comprising the same proteins that make up egg whites. It is harmless and flavorless, but visually unappealing. Eventually, the foam will break up into microscopic particles and disperse into your stock, leaving it grayish and cloudy. The more vigorously your stock bubbles, the faster this process will occur. If the grayness or cloudiness bothers you but skimming is not an option for some reason, you can always remove the micro-particulates later through the clarification process used to make consommé.

Dan C
Removing the scum makes it easier to control the temperature of the stock so you can maintain a constant simmer. If you don’t skim it off, the scum aggregates in a foamy layer on the surface, which acts as insulation. It traps more heat in the stock and can cause your stock to boil when it would otherwise be simmering. Also, since stock often sits unattended on the stove while simmering, un-skimmed stock presents a risk of boil-over.

Firstly, I agree that it’s for aesthetic purposes; many Cantonese stews are very clear when served.

Secondly, some people think it influences the flavor. I think it might be related to the slaughter method. For Halal meat, almost all the blood is drained, so it doesn’t influence the taste. But usually, it’s not completely drained.

And I think if the myoglobin is not boiled, like the juice in a medium steak, it’s very juicy. But if it’s boiled for a long time, it loses flavor. I think for chicken and beef the difference is very small, especially when you use a slow cooker and your chicken is grass-fed. But for pork, some people think the odor of pork is stronger, maybe because of boar taint, hence you will see them skim pork ribs when they make rib stew.

Lastly, you can scoop off the fat.

Update: I found a thesis trying to explain this:
Cause and Prevention of Liver Off-flavor in Five Beef Chuck Muscles

It said “residual blood hemoglobin is known to contribute to liver off-flavor development”. So I guess some people are sensitive to this smell.

Bob S
There are two answers:

  1. If you are boiling meat, the scum is most likely animal fat. If you leave the scum in and just mix it together, it will add to the flavor. There are still reasons to remove the scum, though. One is that you might be trying to make a leaner, more meaty-flavored stock. Another is that pesticides in the animal’s food collect in its fat cells. You probably won’t taste them, but if you’re trying to go organic, you might want to dispose of this rather than consume it yourself.
  2. If you are boiling vegetables, the scum will include potassium hydroxide leaching out from the vegetable matter. Potassium hydroxide, a form of lye, is basic and will taste bitter, though it won’t harm you in such minute doses. A typical westerner raised on a western diet has a dulled sense of taste and probably won’t notice the bitterness, though a person from a different food culture will, and as such might have a custom of skimming the scum even from boiled vegetable stocks and soups.

“Impurities” in bone broth

The foamy scum that forms is a result of blood and other proteins cooking and floating to the top of the liquid. If you’ve roasted your bones, you will not get much, if any, scum.

When I’ve boiled raw bones for 10 minutes (per Pho recipe), emptied water, washed bones and pan and started again, I have gotten minimal to no scum as well.

I’m not a scientist, but I don’t buy this “impurity” nonsense. How is it that the “impure” stuff (whatever that even means) just happens to get foamy while everything else doesn’t? I never skim my stock and it always tastes fine.

Agreed — I think “impurity” is the wrong word here. The scuzz at the top won’t render your broth inedible. It may add a slight bitterness, but the real issue is cloudiness. It’s more of an aesthetic issue if you’re making a clear soup or aspic; not an issue if you’re using the broth in a stew or gravy or blended soup. Washing the bones will help. Roasting the bones will help. Starting the broth with a cold water soak and adding vinegar helps. Skimming helps. But it’s not an essential step.

Yes. Skimming the stuff off is important for several reasons. Presentation for sure, but it also affects flavor and texture. It is not toxic to consume the blood and everything, but it does have its own taste. (I would be hard pressed to believe that anyone thinks it has no flavor on its own.) You can try just tasting the floating stuff and it does have its own taste. Needless to say, when you mix it with the rest of the stock, it will alter the overall flavor, which many people dislike.

I have seen it, I have tasted it, but never have I noticed any negative effects to the stock in taste, only clarity. Nor do I think there are any “impurities” in the foam that we should avoid eating.

I’d like to see some kind of source on that. I know everyone *believes* it to be true, but I’ve never seen any kind of taste test that proved it. The only tests I’ve seen all indicate that skimming really isn’t worthwhile.

Meats and meat products have not always been as clean as they are today. Open air hanging, no refrigeration, etc. led to flies and other critters leaving things behind them. The original “impurities” were bug eggs, dust, bone fragments, and whatever else you can imagine. Much of this loosened in an initial vigorous boil and floated to the top with the foam. With today’s much cleaner food handling in many countries, this isn’t much of a problem anymore. The continued use of the term “impurities” is most likely a carryover from the past, and has come to mean the blood and liquid fat.

It has nothing to do with celebrities or magazines; you’re totally off base, as there is nothing new about calling it bone broth. There’s a lot of info out there if you’re genuinely interested (I’m sure your Google works as well as anyone else’s). There is a lot of interesting research that was done on gelatin up until the ’50s, when food companies figured out how to chemically synthesize natural flavors. Long-cooked bone broths contain glycosaminoglycans like hyaluronic acid and chondroitin sulfate, proline and glycine, and glucosamine; land animal bones are rich in calcium, magnesium, potassium, and phosphorus, and fish are also rich in iodine. The bones literally crumble when a batch of bone broth is done.

Sick White Middle Class Children Are Our Most Precious Commodity

As one sees on occasion in the news, there was a local story of a child being treated for a disease. The purpose is to elicit sympathy and/or inspiration, which is unsurprising and worthy. But it gets one thinking when considering the details and narrative frame. The reporting is often about how the community came together to raise money or otherwise help the child and the family. It’s a feel-good story that follows a particular kind of script. Just as important is picking the right child for the lead role in the drama.

In this case, the child was a cute, white, middle class girl. She was photogenic according to what our society deems good looking, even with her hair loss from chemotherapy. That is the basic profile of nearly every human interest story of this sort. It’s not just any kid that becomes the focus of a human interest story. There have to be hundreds of sick kids in the area who are some combination of less attractive, impoverished, and non-white. But rarely does a major news media outlet tell their stories of suffering and struggle, of overcoming the odds.

That is assuming they overcome the odds. No one reports on the poor kid who died because the parents couldn’t afford healthcare, who was slowly poisoned by lead toxicity from living in a poor industrial area, or who met some other sad demise. No one reports on the black kid whose community didn’t come together when he got sick, because the community was majority white and the family had been excluded and isolated. What we don’t see in the news tells us as much as what we do see.

It reminds one of the studies done on news reporting of criminals. Black criminals are more likely to have their photographs shown than white criminals. This creates the perception that almost all crime is committed by non-whites. The news media teaches and trains us in thinking who deserves sympathy and who does not. The world is divided up as innocent well-off whites who must be saved and criminal poor blacks who must be condemned. News reporting is a morality tale about maintaining the social order.