
Of, By, and For the Elite: The Class Character of the U.S. Constitution

By Crystal Kim

Republished from Liberation School.

Contrary to the mythology we learn in school, the founding fathers feared and hated the concept of democracy—which they derisively referred to as “tyranny of the majority.” The constitution that they wrote reflects this, and seeks to restrict and prohibit the involvement of the masses of people in key areas of decision-making. The following article, originally written in 2008, reviews the true history of the constitution and its role in the political life of the country.

The ruling class of today—the political and social successors to the “founding fathers”—continues to have a fundamental disdain for popular participation in government. The right wing of the elite is engaged in an all-out offensive against basic democratic rights and democracy itself. This offensive relies heavily on the Supreme Court and the legal doctrine of constitutional “originalism.” Originalism means that the only rights and policies that are protected are those explicitly laid out in the constitution, conforming to the “original” intentions of the founders. As the article explores, this was a thoroughly anti-democratic setup that sought to guarantee the power and wealth of the elite.

Introduction

In history and civics classrooms all over the United States, students are taught from an early age to revere the “Founding Fathers” for drafting a document that is the bulwark of democracy and freedom—the U.S. Constitution. We are taught that the Constitution is a work of genius that established a representative government, safeguarded by the system of “checks and balances,” and guarantees fundamental rights such as the freedom of speech, religion and assembly. According to this mythology, the Constitution embodies and promotes the spirit and power of the people.

Why, then, if the country’s founding document is so perfect, has the immense suffering of the majority of its people—as a result of exploitation and oppression—been a central feature of the U.S.? How could almost half of the population be designated poor or low income? Why would the U.S. have the world’s largest and most extensive prison system? If the Constitution, the supreme law of this country, was written to protect and promote the interests of the people, why didn’t it include any guarantees of the most basic necessities of life?

This contradiction between reality and rhetoric can be understood by examining the conditions under which the U.S. Constitution was drafted, including the class background of the drafters. Although it is touted today as a document enshrining “democratic values,” it was widely hated by the lower classes that had participated in the 1776-1783 Revolutionary War. Popular opposition was so great, in fact, that the drafting of the Constitution had to be done in secret in a closed-door conference.

The purpose of the Constitution was to reorganize the form of government so as to enhance the centralized power of the state. It allowed for national taxation that provided the funds for a national standing army. Local militias were considered inadequate to battle the various Native American nations whose lands were coveted by land speculators. A national army was explicitly created to suppress slave rebellions, insurgent small farmers and the newly emerging landless working class that was employed for wages.

The goal of the Constitution and the form of government was to defend the minority class of affluent property owners against the anticipated “tyranny of the majority.” As James Madison, a principal author of the Constitution, wrote: “But the most common and durable source of factions [dissenting groups] has been the various and unequal distribution of property” [1].

The newly centralized state set forth in the Constitution was also designed to regulate interstate trade. This was necessary since cutthroat competition between different regions and states was degenerating into trade wars, boycotts and outright military conflict.

The U.S. Congress was created as a forum where commercial and political conflicts between merchants, manufacturers and big farmers could be debated and resolved without resort to economic and military war.

Conditions leading to the U.S. Revolution

To understand the class interests reflected in the Constitution, it is necessary to examine the social and economic conditions of the time. In the decades leading up to the U.S. revolutionary period, colonial society was marked by extreme oppression and class disparities.

The economies of the colonies were originally organized in the interests of the British merchant capitalists who profited by trade with the colonies. These interests were guaranteed by the British monarchy headed by King George III. In the southern colonies like Virginia, Georgia and the Carolinas, a settler class of slave-owning big planters grew rich providing the cotton that fed Britain’s massive textile manufacturing industry.

In the northern colonies, merchant economies in the port cities and associated small manufacturing industries formed the basis for the division between rich and poor. In the countryside, huge landowners who owed their holdings to privilege in Europe squeezed the limited opportunities of small farmers.

In 1700, for example, 75 percent of land in colonial New York state belonged to fewer than 12 individuals. In Virginia, seven individuals owned over 1.7 million acres [2]. By 1767, the richest 10 percent of Boston taxpayers held about 66 percent of Boston’s taxable wealth, while the poorest 30 percent of taxpayers had no property at all [3]. Similar conditions could be found throughout the colonies. Clearly, there was an established ruling class within the colonies, although this grouping was ultimately subordinate to the British crown.

On the other hand, the majority of society—Black slaves, Native Americans, indentured servants and poor farmers—experienced super-exploitation and oppression. Women of all classes had, like their peers in Europe, no formal political rights.

With these growing class antagonisms, the 18th century was characterized by mass discontent, which led to frequent demonstrations and even uprisings by those on the bottom rung of colonial society.

Between 1676 and 1760, there were at least 18 uprisings aimed at overthrowing a colonial government. There were six slave rebellions as well as 40 riots like the numerous tenant uprisings in New Jersey and New York directed against landlords [4]. Many of these uprisings were directed at the local elite and not the British Empire.

This local elite in colonial society found itself squeezed between the wrath of the lower working classes, on one side, and the British Empire, on the other.

The 1763 British victory in the Seven Years’ War in Europe, which included the so-called French and Indian War in North America, seriously downgraded the position of France as a colonial power competing with Britain. The French would nonetheless send troops and military aid to support the colonists in their war for independence from Britain a decade later.

Following the defeat of the French in 1763, George III attempted to stabilize relations with Native Americans, who had fought primarily alongside the defeated French, by issuing the Proclamation of 1763. This decree declared Indian lands beyond the Appalachians out of bounds for colonial settlers, thereby limiting vast amounts of wealth the settlers could steal from the indigenous people. Chauvinist expansionism thus became fuel for anti-British sentiment in the colonies.

Making matters worse for the colonists, the British Empire began demanding more resources from the colonies to pay for the war. In 1765, the British Parliament passed the Stamp Act, the latest in a series of measures that effectively raised taxes on the colonists. The act incited anger across all class strata, including British merchants, and was ultimately repealed in 1766.

The struggle around the Stamp Act demonstrated a shift in power relations between the colonists and the British Empire. While the local American elites were in less and less need of Britain’s assistance, the British Empire was in ever growing need of the wealth and resources of the colonies.

In summary, there were at least four factors that would motivate the American “new rich” to seek independence from the British crown. First, the anger of the poor and oppressed against the rich could be deflected from the local elite and channeled into hatred of the British crown—developing a new sense of patriotism. Second, the wealth produced and extracted in the colonies would remain in the pockets of the local ruling class rather than being transferred to the British Empire. Third, the local ruling class would greatly increase its wealth through the confiscation of property of those loyal to Britain. And lastly, independence would nullify the Proclamation of 1763, opening up vast amounts of Native land.

Two points qualified the drive to independence, which ultimately manifested itself in the sizable “Loyalist” or pro-British population during the revolution. First, despite the conflict between the colonists and the British government over wealth, colonists and colonizers were united against the Native American population, whom both tried to massacre and loot. The revolutionary struggle was not against exploitation, but to determine who would do the exploiting.

Secondly, in spite of the disputes over who got how much of the wealth generated by the colonies, this wealth primarily depended on the integration of the economy with British merchant capitalism. While the revolutionists wanted political distance from the empire, they could not afford a complete break.

The leaders of the U.S. Revolution

Revolutionary sentiment among the lowest classes of colonial society was largely spontaneous and unorganized. Leadership of the anti-British rebellion, in groups like the Sons of Liberty, came from the middle and upper classes. Some poor workers and farmers did join their ranks, allowing these leaders to garner popular support.

These leaders were conscious of the fact that only one class would be really liberated through independence from Britain: the local ruling class. However, in order to carry this out, they would have to create a façade of liberating the masses.

This is why the 1776 Declaration of Independence—the document used to inspire colonists to fight against Britain—includes language that was so much more radical than that of the 1787 U.S. Constitution. In fact, Thomas Jefferson had originally drafted a paragraph in the Declaration of Independence condemning George III for transporting slaves from Africa to the colonies and “suppressing every legislative attempt to prohibit or to restrain this execrable commerce” [5]. Jefferson himself personally owned hundreds of slaves until the day he died, but he understood the appeal such a statement would have.

Instead, the final draft of the Declaration accused the British monarchy of inciting slave rebellions and supporting Indian land claims against the settlers. “He [the king] has incited domestic insurrection amongst us,” the final version read, “and has endeavored to bring on the inhabitants of our frontiers, the merciless Indian Savages.”

Sixty-nine percent of the signers of the Declaration of Independence held colonial office under England. When the document was read in Boston, the Boston Committee of Correspondence ordered the townsmen to show up for a draft to fight the British. The rich avoided the draft by paying for substitutes, while the poor had no choice but to fight.

Slavery existed in all 13 British colonies, but it was the anchor for the economic system in the mid-Atlantic and southern states.

Thousands of slaves fought on both sides of the War of Independence. The British governor of Virginia had issued a proclamation promising freedom to any slave who could make it to the British lines—as long as their owner was not loyal to the British Crown. Tens of thousands of enslaved Africans did just that. Thousands managed to leave with the British when they were defeated, but tens of thousands more were returned to enslavement after the colonies won their “freedom” in 1783.

Following the 1783 Treaty of Paris, which established the independence of the colonies, vast amounts of wealth and land were confiscated from Loyalists. Some of this land was parceled out to small farmers to draw support for the new government.

While most Loyalists left the United States, some were protected. For instance, Lord Fairfax of Virginia, who owned over 5 million acres of land across 21 counties, was protected because he was a friend of George Washington—at that time, among the richest men in America [6].

The drafting of the Constitution

In May 1787, 55 men—now known as the “Founding Fathers”—gathered in Philadelphia at the Constitutional Convention to draft the new country’s legal principles and establish the new government. Alexander Hamilton—a delegate of New York, George Washington’s closest advisor and the first secretary of the treasury—summed up their task: “All communities divide themselves into the few and the many. The first are the rich and well-born, the other the mass of the people… Give therefore to the first class a distinct permanent share in the government” [7]. Indeed, the task of the 55 men was to draft a document that would guarantee the power and privileges of the new ruling class while making just enough concessions to deflect dissent from other classes in society.

Who were the Founding Fathers? It goes without saying that all the delegates were white, property-owning men. Citing the work of Charles Beard, Howard Zinn wrote, “A majority of them were lawyers by profession, most of them were men of wealth, in land, slaves, manufacturing or shipping, half of them had money loaned out at interest, and 40 of the 55 held government bonds” [8].

The vast majority of the population was not represented at the Constitutional Convention: There were no women, African Americans, Native Americans or poor whites. The U.S. Constitution was written by property-owning white men to give political power, including voting rights, exclusively to property-owning white men, who made up about 10 percent of the population.

Alexander Hamilton advocated for a monarchical-style government with a president and senate chosen for life. The Constitutional Convention opted, rather, for a “popularly” elected House of Representatives, a Senate chosen by state legislatures, a president elected by electors chosen by state legislatures, and Supreme Court justices appointed by the president.

Democracy was intended as a cover. In the 10th article of the “Federalist Papers”—85 newspaper articles written by James Madison, Alexander Hamilton and John Jay advocating ratification of the U.S. Constitution—Madison wrote that the establishment of the government set forth by the Constitution would control “domestic faction and insurrection” deriving from “a rage for paper money, for an abolition of debts, for an equal distribution of property, or for any other improper or wicked project.” During the convention, Alexander Hamilton delivered a speech advocating a strong centralized state power to “check the imprudence of democracy.”

It is quite telling that the Constitution took the famous phrase of the Declaration of Independence “life, liberty and the pursuit of happiness” and changed it to “life, liberty and property.” The debates of the Constitutional Convention were largely over competing economic interests of the wealthy, not a debate between haves and have-nots.

The new Constitution legalized slavery. Article 4, Section 2 required that escaped slaves be delivered back to their masters. Slaves would count as three-fifths of a human being for purposes of deciding representation in Congress. The “three-fifths compromise” was between southern slave-holding delegates who wanted to count slaves in the population to increase their representation, while delegates from the northern states wanted to limit their influence and so not count slaves as people at all.

Furthermore, some of the most important constitutional rights, such as the right to free speech, the right to bear arms and the right to assembly, were not intended to be included in the Constitution at all. The Bill of Rights was added to the Constitution as amendments four years after the Constitutional Convention had adjourned, so that the document could get enough support for ratification.

As a counter to the Bill of Rights, the Constitution gave Congress the power to limit these rights to varying degrees. For example, seven years after the Constitution was amended to provide the right to free speech, Congress passed the Sedition Act of 1798, which made it a crime to say or write anything “false, scandalous or malicious” against the government, Congress or president with the intent to defame or build popular hatred of these entities.

Today, many people look to the Constitution—and especially to the Bill of Rights—as the only guarantor of basic political rights. And while the Constitution has never protected striking workers from being beaten over the heads by police clubs while exercising their right to assemble outside plant gates, or protected revolutionaries’ right to freedom of speech as they are jailed or gunned down, the legal gains for those without property do need to be defended.

But defending those rights has to be done with the knowledge that the founding document of the United States has allowed the scourge of unemployment, poverty and exploitation to carry on unabated because it was a document meant to enshrine class oppression. A constitution for a socialist United States would begin with the rights of working and oppressed people.

During the period leading to the second U.S. Revolution, commonly known as the Civil War, militant opponents of slavery traveled the country to expose the criminal institution that was a bedrock of U.S. society. On July 4, 1854, abolitionist William Lloyd Garrison burned a copy of the Constitution before thousands of supporters of the New England Anti-Slavery Society. He called it a “covenant with death and an agreement with hell,” referring to its enshrining of slavery.

The crowd shouted back, “Amen” [9].

Although slavery has been abolished, the property that is central to the Constitution—private property, the right to exploit the majority for the benefit of the tiny minority—remains. In that sense, Garrison’s words still ring true.

References

[1] James Madison, Federalist Papers, No. 10.
[2] Michael Parenti, Democracy for the Few, 9th ed. (Boston: Wadsworth, 1974/2011), 5.
[3] Howard Zinn, A People’s History of the United States (New York: Longman, 1980), 65.
[4] Ibid., 59.
[5] Ibid., 72.
[6] Ibid., 84.
[7] Cited in Howard Zinn, Declarations of Independence: Cross-Examining American Ideology (New York: Harper Collins, 1990), 152.
[8] Zinn, A People’s History of the United States, 89.
[9] Zinn, Declarations of Independence, 231.

What's Their Endgame?

By Steve Lalla

Republished from Orinoco Tribune.

Invariably, in a conversation about environmental destruction, war in the Middle East, or the pandemic, someone eventually asks the question: “yes, but what’s their endgame?”

Behind this question is the assumption that an elite cabal of capitalist overseers controls everything and foresees the outcome of all their decisions—that they possess an unassailable plan that can never be defeated. Behind this question lurks capitalist realism, characterized by Zizek and Jameson as the mental state in which it’s easier to imagine the end of the world than it is to imagine the end of capitalism.

“We find ourselves at the notorious 'end of history' trumpeted by Francis Fukuyama after the fall of the Berlin Wall,” wrote Mark Fisher in Capitalist Realism: Is There No Alternative? “Fukuyama's thesis that history has climaxed with liberal capitalism may have been widely derided, but it is accepted, even assumed, at the level of the cultural unconscious.”

This mindset is so common that many of us don’t think twice before asking “what’s their endgame?” We believe that in posing the question we’re making a meta-critique of capitalism. The mere formulation of the question supposes that there’s such a thing as organization—a system—a concept that in itself is truly revolutionary for many of us, who don’t even realize that we live within a system, and that alternate ones are possible.

The question “what’s their endgame?” is often posed in the context of the pandemic. In this case the assumption behind the question is that capitalism, if it wanted to, could have reacted better to the pandemic, and could have saved more lives. Therefore, the pseudo-intellect wonders, was there perhaps not a master plan behind letting hundreds of thousands die? Perhaps the US colluded with the leaders of Brazil, Britain, China, Cuba, and the United Nations, and they all agreed on a plan to usher in a police state? The theory falls apart when we accept that various communities had divergent responses to COVID-19. It’s more likely that the agenda behind allowing mass death was the same that capitalists always have: to make as much money as possible with little thought for the consequences.

On the topic of environmental destruction, we aver that billionaires are building spaceships to colonize Mars; that they’ve already planned for the destruction of planet earth and its ecosystems. Whether it’s the singularity, or life on Mars as imagined in books and film, we’ve internalized the idea that the contamination of earth and humanity’s exile into space—or at least an elite fraction of humanity—is actually an ingenious plan devised by the super-intelligent billionaires, not the inevitable result of an economic system that ignores the most basic principles of nature in order to line the pockets of the oligarchy.

Global war? We’re supposed to believe that the imperialist wars in Syria, Afghanistan, or Yemen, are going exactly as planned; that the US withdrew from Iraq in 2011 strategically—the US actually won the war. We’ve internalized the fallacy that Vietnam’s Resistance War Against America went according to US plan; that powers beyond our control act unilaterally on our beings, that we’re not actors in history.

I’m reminded of a comment someone posted on my Facebook wall this week: “if one wishes an answer, there is none, to war and chaos. When it all spills over it becomes a world war, and history repeats itself, over and over.” There are elements of truth to these ideas, which is why they’re so appealing.

“This malaise, the feeling that there is nothing new, is itself nothing new of course,” wrote Fisher.

History repeats itself: an appealing idea

Perhaps the root of this idea is the oft-repeated saying that “those who do not learn from history are doomed to repeat it.” It turns out that this is really an altered version, or a misquote, of the original written by Spanish philosopher Jorge Santayana in 1905: “those who cannot remember the past are condemned to repeat it.”

From this deeply ingrained idea we extrapolate the idea that history repeats itself—but that’s not at all what Santayana wrote. Neither of these maxims is meant to teach us that history repeats itself. Santayana’s statement warns us that if we don’t learn from history, we won’t be involved in the making of the future. While cyclical elements are certainly involved in both nature and in human history, to believe that either nature or history truly repeats itself is simply to bow out of the game—to quit before we’ve even played. A similar logic is at play when we ask “what’s their endgame?”

The reasonable alternative is to recognize that reality is always changing, and will continue to change; to recognize that humans play an active role in creating the future. Our realities are not determined by forces entirely outside of our control. This recognition gives us both strength and optimism.

 “For revolutionary hope to come into being, we need to discard determinism,” writes Yanis Iqbal. “Instead of dialectically locating an individual in the interconnected economic, political and cultural systems, institutions and structures, determinism considers him/her to be unilaterally influenced by it. A determinist conception is based on the dichotomous division of existence into an ‘external world’ and ‘human consciousness.’ In this conception, the external world and consciousness are two different components of human existence.”

In truth, humans are not separate from the world around us. It’s self-evident that we’re part of it. We require air, water, and sustenance to live and to think. Conversely, we alter and metabolize the world around us by our existence. The oligarchy and the elite, while they may live in ivory towers, are subject to the same forces of nature as we are, with the same powers and limits of agency. They may have various plans, and diverse strategies, but none of this ensured that their plans worked perfectly in the past, nor will in the future.

Despite its popularity in academia—particularly in philosophy, cultural studies and postmodernism—it’s easy to demonstrate that capitalist realism is incorrect. One only has to imagine other mental states or cultural tropes such as apartheid realism, feudalist realism, or hunter-gatherer realism.

Understanding and analyzing the plans of our opponents or enemies is important. The assumption that we are powerless before them is fallacious and futile.

Often, what lies beneath the question “what’s their endgame?” is conspiracy theory. Yes, many elements of conspiracy theory are certainly true and yes, groupings of like-minded people marshal increased power to guide events. However, blind adherence to conspiracy theory ignores the self-evidence of the greatest conspiracy of all: that we, as humans, all conspire together to create the future.

The Wall Street Journal's Pitch for Mass Murder is Catching on in Capitalist Circles

By J.E. Karla

Not even two weeks into an extraordinary response to the novel coronavirus outbreak, the upper echelons of capital are wondering whether saving millions of lives is really worth the damage being done to their investment portfolios. According to reports, the debate among the ruling class is over whether or not to walk back some of the measures taken to slow the spread of the virus -- efforts already considered tardy and inadequate by public health experts -- in order to minimize business losses. 

Like many elite notions, this idea was first launched in the editorial pages of the Wall Street Journal. An unsigned editorial there is about as visible as the vanguard of the bourgeoisie ever really makes its deliberations, and this one last week (behind a paywall, of course) was especially candid.

After opening paragraphs congratulating the response to date, hoping that “with any luck” the nation’s health care system won’t collapse, they lay out their basic thesis:

“Yet the costs of this national shutdown are growing by the hour, and we don’t mean federal spending. We mean a tsunami of economic destruction that will cause tens of millions to lose their jobs as commerce and production simply cease. Many large companies can withstand a few weeks without revenue but that isn’t true of millions of small and mid-sized firms.”

After some attempts at tugging heartstrings over the entrepreneurs who will eat the most shit in the months to come -- using the petit bourgeoisie as human shields for big business, as is custom -- and some other telling admissions we’ll return to, they end with this:

“Dr. (Anthony) Fauci (Director of the National Institute of Allergy and Infectious Diseases) has explained this severe lockdown policy as lasting 14 days in its initial term. The national guidance would then be reconsidered depending on the spread of the disease. That should be the moment, if not sooner, to offer new guidance on what might be called phase two of the coronavirus pandemic campaign.” 

They do not have the guts to explicitly state that this “phase two” would mean allowing most normal activity -- the contact the virus needs to continue its spread -- to return, but their weasel word description of “substantial social distancing… in some form” (emphasis mine) says it all. “This should not become a debate over how many lives to sacrifice against how many lost jobs we can tolerate… But no society can safeguard public health for long at the cost of its overall economic health.”

They don’t want to debate how many lives to sacrifice in the name of saving “jobs” -- a euphemism for the fortunes of employers, the bourgeoisie -- but that’s a great way to describe dialing back the only measures so far demonstrated to work against this plague in the name of economic “health.”

How many lives are we talking about? As I write, 565 people have died of the disease in the United States, with fatalities doubling every 2-3 days. The experience in Europe and China indicates that response measures take roughly a week to slow the virus down. That means that we should see 2-3 more doublings before last week’s actions finally take effect, 2,260 to 4,520 dead people this week. The Journal and their allies are suggesting that we should let those effects last a week, and then ratchet up the spread of the virus again.

Even assuming a very optimistic scenario where the doubling drops by half -- i.e. to once every 4-6 days -- and then lands somewhere in the middle -- say 3-5 days -- that would mean somewhere between 72,000 and nearly 600,000 dead people just a month from now. 

But it’s worse than that, because there are about 5 times as many critical cases as there are fatalities. The absolute best case scenario puts us at more than 360,000 critical cases in a country with fewer than 100,000 intensive care beds. The worst case puts us at 3,000,000.
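The weekly and monthly figures here follow from simple doubling-time arithmetic. Below is a minimal sketch of that math (an illustration of the back-of-envelope reasoning, not the article's own model), using only the numbers stated above; the month-out projection is extremely sensitive to the assumed doubling time.

```python
# A rough sketch of the doubling-time arithmetic above. The inputs are the
# article's stated assumptions (565 US deaths at the time of writing, 2-3 more
# doublings this week, slower doubling afterward, ~5 critical cases per
# fatality, fewer than 100,000 ICU beds); this is illustrative, not a model.

def projected_deaths(start: float, days: float, doubling_time: float) -> float:
    """Cumulative deaths after `days`, doubling every `doubling_time` days."""
    return start * 2 ** (days / doubling_time)

START = 565  # deaths at time of writing

# "2-3 more doublings before last week's actions finally take effect":
print(START * 2 ** 2, START * 2 ** 3)  # -> 2260 4520 (this week's range)

# A month out, sweeping slower post-measures doubling times: a 3-day doubling
# lands near the article's "nearly 600,000" upper figure, while slower
# doubling times give tens to low hundreds of thousands.
for dt_days in (5, 4, 3):
    deaths = projected_deaths(START, 30, dt_days)
    critical = 5 * deaths  # ~5 critical cases per fatality
    print(f"doubling every {dt_days}d: ~{deaths:,.0f} deaths, "
          f"~{critical:,.0f} critical cases vs. <100,000 ICU beds")
```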

You can then add thousands of deaths from non-coronavirus causes that could not get adequate treatment -- car accidents, allergic reactions, heart attacks, etc. And that one-month cutoff is arbitrary; the deaths would continue after that. In the New York Times, Nicholas Kristof quoted a British epidemiologist as estimating a best case of 1.1 million. That best case involves much more distancing than what the Journal and company are proposing. They are calling for hundreds of thousands of people, perhaps millions, to be sacrificed for the sake of “economic health.”

This bloodthirsty logic is precisely the sort of thing capitalists project onto communists. This, however, brings us to the admission I alluded to above, buried in the middle of the editorial:

“Some in the media who don’t understand American business say that China managed a comparable shock to its economy and is now beginning to emerge on the other side. Why can’t the U.S. do it too? This ignores that the Chinese state owns an enormous stake in that economy and chose to absorb the losses. In the U.S. those losses will be borne by private owners and workers who rely on a functioning private economy. They have no state balance sheet to fall back on.”

We don’t need to debate the class character of the Chinese state -- even the Communist Party of China will admit that “socialism with Chinese characteristics” accommodates global capital. Regardless, the Wall Street Journal openly admits that the options at hand are a state-controlled economy capable of stemming the plague’s advance or letting potentially millions of people die for the sake of sustaining a privately-owned one. 

The US government could easily freeze all debts, rents, and other contractual payments, guarantee a short-term income for all families, and take all necessary measures to maintain provision of food, medicine, utilities, and vital services until the virus has run out of steam. But even a momentary economy run on the basis of human need and not the accumulation of profit poses the threat of a good example. It’s bad enough that China does it incompletely, hence official bellicosity against them even in this hour of mutual need. 

There is no amount of human lives the ruling class wouldn’t trade to prevent that risk, especially when they know they are the least likely to die.  

The only silver lining is that one way or the other most of us will come out on the other end of this nightmare, and when we do the argument we must make is clear: capitalism will continue to kill us by the millions and billions until it is stopped. You don’t even have to take our word for it -- you can read it in the paper. 

The Capitalist Coup Called Neoliberalism: How and Why It Went Down

By Colin Jenkins

Rich people have always had class consciousness because... they want to stay rich. This collective consciousness led the "founding fathers" of the United States to set up systems of governance that would, first and foremost, protect them (the wealthy, landowning minority) from the landless, working majority (slaves, indentured servants, laborers). Since then, the rich have had undue influence on every aspect of US life: housing, food production and distribution, education, media, and politics. As capitalism has developed well into its late stages, this has led to large concentrations in wealth and power, and thus influence.

In order to maintain control, the rich have learned over time that minimal concessions must be given to the working class to avoid societal unrest. Marxist theorists like Antonio Gramsci and Nicos Poulantzas described this process as using the state to steady the "unstable equilibrium." This instability is produced by capitalism's tendency to pool wealth at the top while dispossessing the majority. For much of the 20th century, capitalists in the US were successful in maintaining an internal equilibrium, mainly due to their ravaging of the so-called "third world" through colonialism and imperialism. With this massive theft of resources throughout the global South (Africa and Latin America), a robust "middle class" was carved out from a mostly white sector of the US working class. This "middle class" consisted of workers who were provided a greater share of the stolen loot than their class peers, and thus awarded the "American Dream" that was widely advertised.

The US "middle class" was a crucial development for the rich because it provided a buffer between them and the masses. Due to the relative comfort they were allowed, "middle-class" workers were more likely to support and collaborate with capitalists, even at the expense of their fellow workers who were left struggling for scraps from below. After all, for there to be a middle class, there must be a lower class. Under capitalism, the lower class is the invisible majority by design. The capitalist class shapes dominant culture from above, the middle class serves as the standard bearer of this culture, and the lower class clings to survival mode in the shadows of society. The key for the rich is to keep the invisible majority in check. The "middle class" has always played a crucial role in this.

Despite this balancing act that was maintained for decades, capitalism's internal contradictions became predictably volatile heading into the latter part of the century, culminating in what economist Michael Roberts refers to as the profitability crisis of the 1970s. As the capitalist system was approaching this crisis, US society had already begun confronting social ills stemming from systemic white supremacy, patriarchy, and the Vietnam War. Naturally, this moved into the economic sphere, as workers and students began to successfully tie the array of social injustices to the widespread economic injustice created by the capitalist system. The existence of an invisible majority, the victims of capitalism and its corollary systems of oppression, was uncovered. This scared the rich, enough that they felt the need to fortify their previously unshakable privileges. After the groundswell of liberation movements that formed during the 60s, which was fueled by a wave of (working) class consciousness from below, the rich decided to organize and weaponize their own (capitalist) class consciousness to protect their assets, collectively, from the threat of democracy.

In examining what had gone wrong in the 60s and why so many people had the audacity to demand more self-determination, the notorious Trilateral Commission convened in 1973, bringing together economic and political elites from North America, Europe, and Japan. The Commission, as described by Encyclopedia Britannica, "reflects powerful commercial and political interests committed to private enterprise and stronger collective management of global problems. Its members (more than 400 in the early 21st century) are influential politicians; banking and business executives; media, civic, and intellectual leaders."

In 1975, Michel Crozier, Samuel P. Huntington, and Joji Watanuki published a report for the Commission, titled: "The Crisis of Democracy: On the Governability of Democracies." In assessing the various movements that gained momentum in the 60s (racial justice, economic justice, anti-war, etc.), the report determined that these "problems" stemmed from an "excess of democracy." Huntington specifically noted that, "the vitality of democracy in the United States in the 1960s produced a substantial increase in governmental activity and a substantial decrease in governmental authority." The solution to this, according to the report, was to reverse direction - decrease "governmental activity" and increase "governmental authority" to restrict democratic impulses from the masses and maintain the capitalist power structure internally, while retaining "hegemonic power" internationally. In other words, rather than government serving people and regulating capitalists, government should serve capitalists and regulate people.

Since maintaining a "middle class" had become such a fragile proposition, the capitalist class forged a new direction. Rather than rely on this historical buffer and continue the concessionary and fickle balancing act, they decided it would be more effective to simply take ownership of the legislative and judicial process. This process began when executive officers from several major corporations joined together to form private groups like the Business Roundtable, for the purpose of "promoting pro-business public policy." In other words, to make sure that the "excess of democracy" which occurred during the 60s would never return. Why? Because any such mass movement toward relinquishing power to the people is a direct threat to capitalist profit and corporate America's existence as a collection of unaccountable, authoritarian, exceptionally powerful, private entities. The Business Roundtable, which included executives from corporations like Exxon, DuPont, General Electric, Alcoa, and General Motors, gained instant access to the highest offices of the government, becoming extremely influential in pushing for corporate tax cuts and deregulation during the Reagan era.

Since the 1980s, the Business Roundtable has run roughshod over American workers by using the federal government to:

- reduce consumer protections,

- obstruct employment stimuli,

- weaken unions,

- implement "free trade" agreements that spur offshoring and tax havens,

- ease environmental protections,

- increase corporate subsidies,

- loosen rules on corporate mergers and acquisitions,

- open avenues of profit in the private healthcare system,

- privatize education and social programs,

- and block efforts to make corporate boards more accountable. [1][2][3][4][5]

As political momentum developed within corporate America, additional players jumped aboard this strategic and highly coordinated capitalist coup. While groups like the Business Roundtable targeted legislation, the US Chamber of Commerce (CoC), a "private, business-oriented lobbying group" which had already served as a popular vehicle for turning (capitalist) class consciousness into action since 1912, shifted its focus onto the court system. Since then, the CoC has used its immense resources to influence US Supreme Court decisions that benefit big business, a tactic that has become increasingly successful for them over time. The CoC's business lobby had "a 43 percent success rate from 1981 to 1986 during the final years of Chief Justice Warren Burger's tenure," a 56 percent success rate from 1994 to 2005 (the Rehnquist Court), and boasted a 68 percent success rate (winning 60 of 88 cases) during John Roberts's first seven years as Chief Justice. The CoC improved even more on its pro-corporate, anti-worker attack in 2018, winning 90 percent of its cases during that term. As Kent Greenfield reported for The Atlantic,

"One measure of the [2018 term's] business-friendly tilt is the eye-popping success rate of the U.S. Chamber of Commerce, the self-proclaimed "Voice of Business." The Chamber filed briefs in 10 cases this term and won nine of them. The Chamber's victories limited protections for whistleblowers, forced changes in the Securities and Exchange Commission, made water pollution suits more difficult to bring, and erected additional obstacles to class action suits against businesses. Only the geekiest of Supreme Court watchers monitor such cases. But the Chamber pays attention, and it pays off."

Groups like the Trilateral Commission, Business Roundtable, and Chamber of Commerce have taken prominent roles on the front lines of the 40-year, capitalist slaughter of American workers, but if there was a single, powerful element that solidified this coup it was a memo written in 1971 by Lewis Powell. The Powell Memo, or Powell Manifesto, as it has come to be known, made its rounds among corporate, economic, and political elites during this crucial time. Powell, a corporate lawyer, board member of nearly a dozen corporations, and soon-to-be Supreme Court Justice, sent the memo to the Director of the U.S. Chamber of Commerce, Eugene Sydnor, Jr., as a call to action for corporate America.

Powell's memo was a diatribe against any and all elements that would dare to question capitalism. While giving mention to "Communists, New Leftists and other revolutionaries who would destroy the entire system, both political and economic," the memo focused on what was viewed as the most immediate threat - the same "excess of democracy" referred to in the Trilateral Commission's report. "What now concerns us is quite new in the history of America," wrote Powell. "We are not dealing with sporadic or isolated attacks from a relatively few extremists or even from the minority socialist cadre. Rather, the assault on the enterprise system is broadly based and consistently pursued. It is gaining momentum and converts" throughout the working class. Powell took special interest in those "from the college campus, the pulpit, the media, the intellectual and literary journals, the arts and sciences, and from politicians" whom he regarded as small in size but "the most articulate, the most vocal, the most prolific in their writing and speaking."

Powell's memo laid out a blueprint for the capitalist coup that is now referred to as neoliberalism: identifying and listing the enemies pushing for self-determination, criticizing the business community for its apathy and lack of urgency in recognizing this growing threat, suggesting how business executives and the Chamber of Commerce might proceed in obstructing these democratic impulses from below, and even laying out detailed plans for infiltrating campuses, the public, the media, the political arena, and the courts with pro-capitalist action and propaganda.

Reclaim Democracy, an activist organization based in Montana, explains,

"Though Powell's memo was not the sole influence, the Chamber and corporate activists took his advice to heart and began building a powerful array of institutions designed to shift public attitudes and beliefs over the course of years and decades. The memo influenced or inspired the creation of the Heritage Foundation, the Manhattan Institute, the Cato Institute, Citizens for a Sound Economy, Accuracy in Academe, and other powerful organizations. Their long-term focus began paying off handsomely in the 1980s, in coordination with the Reagan Administration's "hands-off business" philosophy."

At a time of monumental capitalist regrouping and coalescing against the "dangerous rise" of self-determination, the influence of Powell's manifesto is difficult to overstate. It provided ideological fuel for the birth of a substantial corporate lobbying industry, which produced immeasurable pro-business and anti-worker legislation for decades to come. The memo also served as a wake-up call to capitalists throughout corporate America, supplementing the formation of groups like the Business Roundtable and urging forceful actions from the US Chamber of Commerce. The results, according to Jacob S. Hacker and Paul Pierson, were undeniable:

"The organizational counterattack of business in the 1970s was swift and sweeping - a domestic version of Shock and Awe. The number of corporations with public affairs offices in Washington grew from 100 in 1968 to over 500 in 1978. In 1971, only 175 firms had registered lobbyists in Washington, but by 1982, nearly 2,500 did. The number of corporate PACs increased from under 300 in 1976 to over 1,200 by the middle of 1980. On every dimension of corporate political activity, the numbers reveal a dramatic, rapid mobilization of business resources in the mid-1970s." [6]

The real-life effects of this capitalist coup have been disastrous for most. US workers have experienced declining or stagnant wages since the 1970s. As a result, many must rely on credit (if lucky enough to qualify) even to obtain basic necessities, which has resulted in skyrocketing household debt across the board. The debt-to-disposable income ratio of American households more than doubled from 60% in 1980 to 133% in 2007. Meanwhile, any hope of saving money has disappeared. While the household "savings rate roughly doubled from 5% in 1949 to over 11% in 1982, it looks like a downhill ski slope since then," and registered in negative territory by 2006. Conversely, as designed, the rich have benefited immensely, to the point where income inequality has increased to pre-Great Depression levels. Those who orchestrated the coup (the top 1%) claimed about a quarter of all wealth during the 1980s, and now own over 40% of all wealth in the country. To put this in perspective, the bottom 90% of all Americans combined account for barely half of that, claiming 21% of all wealth.

And, perhaps most importantly, the coup helped fund the growth of a massive capitalist propaganda machine to convince the working class to support our own demise. This includes a co-opted and recalibrated liberal media, the rise of right-wing talk radio, and the birth of the Fox News network - all designed to do one thing: "inform and enlighten" workers on the wonders of capitalism and American exceptionalism, the friendly nature of big business, and the "excessive" dangers of self-determination.

As Powell noted in 1971, "If American business devoted only 10% of its total annual advertising budget to this overall purpose (of marketing and selling the idea of capitalism), it would be a statesman-like expenditure." And statesman-like it has become, running interference and garnering "manufactured consent" for a capitalist coup that has been cemented over the course of four decades, six presidential administrations, a Wall Street run amok, and a massive transfer of generations (including future ones) of public revenue into private hands.


Notes

[1] "The Business Roundtable and American Labor," a report by J. C. Turner, General President International Union of Operating Engineers, AFL-CIO (May 1979). Accessed online at http://laborrising.com/2013/07/union-organizing-and-the-business-roundtable-and-american-labor/

[2] "The Anti-Union Game Plan," Labor Notes (July 2, 2018). Accessed online at https://labornotes.org/2018/07/anti-union-game-plan

[3] Lafer, G. (October 31, 2013) "The Legislative Attack on American Wages and Labor Standards, 2011-2012," Economic Policy Institute. Accessed online at https://www.epi.org/publication/attack-on-american-labor-standards/

[4] Gilbert, D. (2017) The American Class Structure in an Age of Growing Inequality (SAGE publications)

[5] Goldfield, M. (1989) The Decline of Organized Labor in the United States (University of Chicago Press), p. 192

[6] Hacker, J.S. & Pierson, P. (2011) Winner-Take-All Politics: How Washington Made the Rich Richer - And Turned Its Back on the Middle Class (Simon & Schuster)

Contrived Connections of Capital

By Steven L. Foster

Unpastoral Limps

I made a couple of connections while taking an early morning bike ride along tree-lined, deeply rutted and pockmarked dirt access lanes leading me through expanses of flooded checkered rice fields, sprouting green and dotted with white heron. One connection was a barbed-wire fence newly strung and anchored by poured concrete posts rising higher than the older bamboo barriers tied to trees along the pathway designating ownership over parcels of land.

The other was an old man in a not-too-distant wooded area emerging from his tiny platformed shack constructed from corrugated sheet metal, rough-hewn wood planks, and bamboo. Remnants of a wood fire smoldered in an arched, ground-level mud oven. Chickens scampered about, dogs barked once they spotted me, and a couple of penned-up pigs grunted near a small vegetable garden. The abode was likely his year-round home, and not a makeshift shelter built in shaded areas for temporary field laborers escaping the tropical sun.

Clad in grimy clothing, he listed to his left, limping severely (I too have limps), and slowly trudged toward a beat-up, grimy motorcycle with sidecar. He nodded toward me in acknowledgement while calming his dogs and hoisting a bundle of wood kindling from the sidecar. The old man is likely a tenant farmer hired by the landowner who constructed the new barbed-wire fencing. Less than 14 percent of the farmers in the largely agrarian country where I reside own the land they work, even though less than three decades ago 44 percent were small land-owning farmers.

Triggered by the fence and by the sight of the farmer, who represented to me an arduous life of poverty and toil for someone so old, I briefly thought: All three of us are commodities.

A contrived connection? Yes. But not by me. We were intentionally made commodities and had little choice in the matter. After all, who wants to be merely a commodity unless you're branded as a wealthy superstar, luring others "to be like Mike," Madonna, or Rihanna? Especially since I believe far more intrinsic connections exist between the three of us, whether we know it or not.

In what follows, I'll briefly explore the broad historical processes by which the land, the old man, and I received our assigned roles in the socially constructed capitalist global market. Retired (also with little choice in the matter) after nearly five decades of working, I've had time to reflect on my life in a society designed to turn as much as it can into commodities, where my value resided largely in making someone else profits.

Let me first clarify what is meant by commodity and the processes of commodification.

Marx's analysis of a commodity basically stresses a thing's exchange value, quite different from the use value an item possesses. For instance: A new case-hardened steel axe may have higher use value to the tenant farmer gathering firewood than his old axe. That steel axe doesn't have much use value for someone living in a Chicago high-rise building. However, a hardware store owner (she lives in a high-rise) changes the nature of the axe's value by selling it. It becomes valuable for the profit it yields: buying the axe at a low price, then selling it at a price above the initial costs of purchase and overhead. The axe is now a commodity to the store owner.

Selling everything it can as commodities is essential to capitalism. That's how profits are made. The higher the degree of profit, the more valued a commodity becomes, often outstripping its worth as a useable item.

Take as an example a sturdy handbag and an elegant Gucci satchel. Both perform similar tasks by carrying things and may even require similar material. But the Gucci sells at a significantly higher price, very likely making much more profit, and therefore retaining higher commodity value when sold. Not surprisingly, you don't want to harm a costly satchel by carrying potentially leaky groceries in it. Just as you'd not want to take a grocery-carrying handbag, maybe stained from fresh fruit that got squashed and seeped, to a stylishly sophisticated restaurant. You pull out the Gucci for such an occasion and don't carry much in it, so there'd be no conspicuous bulges breaking the lines that complement the sleekness of your evening wear.

Economic historian Karl Polanyi noted from his study of capitalist history (modern western history) that the commodification of land, labor, and money was necessary for capitalism to work as a social system (see The Great Transformation: The Political and Economic Origins of Our Time, Beacon Press edition, 2001). Without a capitalist market built around these three forms of commodities, we have something other than capitalism. He also suggested that commodifying these things stretches the seams of our social makeup, where vital connections to our environments and the people in them break down.

We'll look at a broad history of how land and then labor changed into commodities that fueled my initial lament connecting me, the fenced land, and the (other) old man.

(I'm connecting large historical dots with thick lines in keeping with the essay's scope.)


Something Called Nature: Dominated and Sold

Modern humans have hypothetically labelled things in the world not made by us as part of nature. A tree is natural. The wood from it making a pressboard bookshelf isn't. It's made by a human culture. The cultural worlds of people somehow became separated from a "natural" world comprised of nonhuman things: wild forests and jungles, oceans and reefs, and all the animals and strange stuff in them we study and use for our purposes. We moderns think of ourselves as minds living in bodies that we steer and engineer like a spacecraft from another world fashioned from material that's alien to the aliens.

Thinking and living like that's true provides us an illusion of our transcendence from a non-living world of atoms and what they combine to make, just waiting for us to give them meaning. People with minds give minds to mindless matter found in nature. Nice of us.

It's an unfounded modern concept. Our current sciences are showing that the separation between nature and culture is patently artificial, a mere abstraction created by people in the past who considered themselves scientific yet were unaware of their specifically historical conditioning and the perspectival limits of knowledge saddling our finitude as humans. Examples of current scientific inquiry blurring the distinction include: quantum physics, with theories of massless (or nearly massless) "fields" forming the primary essence of an organically connected universe (very likely one of many universes); molecular biology informed by this (meta)physics, suggesting particles can be in two places at once, which is important for respiration and nutrient absorption; sciences pursuing theories of consciousness and finding it in more than just smart animals, like the self-awareness of tree communities; neurosciences questioning the existence of a unique self separated from the "outside" experiences forming an individual; and anthropology studying human cultures while still asking what humans and their societies really are.

But this "Great Divide" between nature and culture persists. That's because the "Divide" has been, and continues to be, useful. (For analysis of the Divide see Bruno Latour, We Have Never Been Modern, Harvard University Press, 1993, and Philippe Descola, Beyond Nature and Culture, University of Chicago Press, 2013. I've taken current science references from a plethora of sources.)

The nature part of the Divide is seen by modernity as a thing to be subdued. The Christians of the 16th and 17th centuries took the biblical mandate from Genesis to subdue God's creation of a natural order in a way very different from a Hebraic understanding of the passage. The myth of humans subduing and having dominion over nature was likely viewed by the authors of Genesis as caring for and nurturing creation as one of its creatures, placed on earth as a divine vassal, a totem of God's cherishing presence (humanity in the image of God). The European religious understanding shifted the meaning to what the father of the scientific method, Francis Bacon (1561-1626), proposed. Nobly concerned as he was with the plight of humans struggling against uncertainties in a natural world, his search for scientific knowledge aimed at bringing nature under the utter control of people. Now, subduing the earth means exercising dominating power over it in a constantly raging battle to conquer it as something other than us.

It's important to recognize this understanding of the relation between us and nature when considering how land became a commodity in the capitalist world. Using land productively for profit as an individual owner sees fit, with the land passed along to other individuals through sale or inheritance as a commodity, became like a religion in the West, vehemently protected by law.

To be civilized, however rich and ancient your historical traditions, you must adopt the modern capitalist understanding regarding land. The primary meaning of "the rule of law and order" is the protection of private property and the absolute rights of an individual owner over that property, especially land. This is a very modern understanding, legally codified as recently as the latter part of the 19th century, though theoretically formulated by John Locke two centuries before then. If you're one of those societies resisting "the rule of law," you're automatically relegated to the natural world and not to culture, or at minimum to a nebulous area in between the two (Latour); and therefore, you are less human and in need of civilizing.

Land use for myriad cultures throughout human history was not for individual exploitation but for communities to take what was necessary for their cultural existence, replenishing the resources when able, or moving on when unable, allowing the land to recover and revitalize itself over time. The heads of communities, both women and men, provided land allotments based on specific family needs. Larger families got more land for their sustenance. Most often, there was redistribution of pooled resources ensuring that the needs of those unable, or less able, to care for themselves were met. Reciprocity, not the individual gain central to commodity exchange (now a nearly universal feature of our global capitalist cultures), was the basis of economic practice. In some societies, leaders were chosen based on their capacity to give away the material excesses they accumulated through war or other means and ways.

Yes, there were resource wars over land and its contents. Tribal boundaries existed designating areas used by specific communities, and infringement on these territories could lead to conflict, especially in times of scarcity. However, treaties were made allowing other groups certain access rights as needed, and inter-tribal marriages brought communities together, shaping the allocation of combined resources.

In sum, for most of the globe's cultures, exclusively individual ownership of land was a totally outlandish concept.

Of course, land was controlled. Under empires, it may have fallen under the jurisdiction of individual rulers and the religions supporting them. What control primarily meant, though, was exacting tribute from populations on the dominated land, payments often in the form of produce from it. There wasn't ownership of a thing called "private property." The subsistence needs of the populace garnered from the land were granted by those in authority; that is, if a dominating leader wanted to remain in power. When populations were denied land access, they violently rebelled, jeopardizing a ruler's position.

In ancient Greece, private ownership for the sake of individual commercial exploitation occurred as prominent men transacted with other city-states and cultures through trade networks. But, land was still provided to non-slaves by law. Of course, tribute/taxation and other services were required for a ruler's protection.

Absolute ownership by an individual became legally granted under Roman law, through statutes passed in the Republic by the elite landed classes. It was a break with tradition. Again, land was provided as a cultural practice to plebeians, the classes of commoners. Access to land was vital to Roman self-understanding, since citizen farmers served in the army when called upon, with payment for military service often coming through land grants. Householding networks were central to the Roman economy, with redistribution of booty from the conquered supplementing the basic household units.

After the break-up of the Roman empire, access to land for all classes of people during the middle ages followed centuries of socially engrained custom. This was the right of the commons (common land use).

The lowest of peasant groupings in the constructed social hierarchy, villeins, were able to maintain subsistence from the land: growing food, using materials for housing, raising livestock, and making things needed for essential living, which were also traded for other needed items in local markets controlled by social custom. The villeins maintained about 13 acres in their modest farms, holding between 40 and 50 percent of all the arable land according to geographer Gary Fields (Enclosure: Palestinian Landscapes in a Historical Mirror, University of California Press, 2017, Kindle location 821). In 1300 CE, nearly 50 percent of farming in England was on the commons (Fields). Landed nobility exacted tribute that consumed most surpluses and also demanded services from villeins that could change like the weather.

Life was not easy. However, the deeply felt pride of self-sufficiency was part of the social fabric. Fields reveals that villeins, even though legally land insecure, still had recourse through manor courts, which protected rights to the land based on custom. They were able to procure more holdings of land left fallow after the Black Plague depleted the population. Copyhold became a legal practice in the 15th century whereby land occupied by villeins could be passed on to family if fines were paid to the nobility.

They, like freeholding farmers, were vested in the land they worked, upgrading and maintaining both their individual lots and the common fields worked by the community. It was a unique balance between individual farms and collective agrarian practices on the commons (Fields). Peasant farmers, along with the manor courts, ensured individuals would not dominate the commons and regulated grazing, crop rotation, and land regeneration so that individual farmers would preserve their lots for the benefit of the community and its future. Agricultural innovations boosting productivity were implemented well before the modern technological revolutions occurred.

All was to change by the dawning of the 16th century as the English nobility, driven by a number of factors, most notably commercial greed as mercantile capitalism was on the rise, began a long march of physically enclosing the land under their control. It was, as Fields states, "a long-term project of improving land by 'making private property' on the English landscape. This transformation represents a decisive moment in the long-standing lineage of reallocations in property rights, in which groups with territorial ambitions gained control of land owned or used by others." The result was "…eradicating common field farming and remaking a landscape that once boasted a large inventory of land used as a collective resource" (loc 759).

Very importantly, Fields points out that enclosing the land, using fences, barriers, and roads to designate individualized private property, not only meant all-inclusive control over it and all that it contains, but the exclusion of others. Even access passages leading to other areas of land that still held common use were denied. Bogs and forested areas were closed off from common use, making hunting, food foraging, even firewood gathering illegal to all except the owner. A whole way of life existing for centuries was slowly, yet systematically, dismantled as agrarian societies throughout England, Europe, and then the world lost vital access to land.

The rationale was the improvement of land for commercial ends that would bolster the integration of a national economy trading increasingly on a global scale. Whether turning grain-producing fields into pasture or using them for monocrop growth in order to maximize exports outside the region or country, the goal was profit for elite landholders. Larger estates dominated the English landscape. The villeins, the most precarious members of society, were the first to be affected as peasant residents were expelled. Smaller holdings (yeoman farmers and copyhold villeins) were bought out or outright ousted, with the rationale being "efficiency" (read: maximizing profits) in using land. What happened was an all-out assault on common field agriculture. And whether the enclosures achieved any noticeable efficiency in greater productive capacity over the long haul, other than increased profits, is still a matter of debate among agricultural historians. Overgrazing and soil exhaustion through large-scale monocrop production had detrimental effects decades after the initial short-term increases in produce brought by the so-called improvements.

Land is now fully a commodity: a thing to be personally possessed and valued for its profitability as an investment. All others are excluded from using this commodity.

Importantly, land enclosures caused unrelenting social upheaval. It was the primary impetus for the institution of wage labor on a mass scale.


Behold! A Labor Market Is Birthed

It's always remarkable just how actions become supported by theoretical 'reason' after events have already occurred. John Locke, a founding thinker regarding land use and any "natural law" surrounding it, systematized what constituted proper land management while the English had been slaughtering Indigenous peoples and taking their lands for decades prior to his ruminations. He served as a colonial bureaucrat overseeing the process in the Americas. Adam Smith, father of the "science" of modern economics, theorizer of free markets and worshipped by adoring capitalists in following generations, saw all people in history acting like his local butcher; that is, humans as essentially bartering, trucking, and trading beings driven by individual self-interest looking for personal gain. However, as Polanyi suggests, "…the alleged propensity of man to barter, truck, and exchange is almost entirely apocryphal" (pg. 46). Like the dichotomy between nature and culture, Smith's reductive speculations concerning human motivation are more useful than fundamentally truthful.

This understanding of what constitutes the human was necessary for transforming society into capitalist culture. Polanyi notes, "The transformation implies a change in the motive of action on the part of the members of society; for the motive of subsistence that of gain must be substituted. All transactions are turned into money transactions" (pg. 43). Subsistence labor on the land must be displaced in capitalist society, workers now becoming wage earners.

The cultural change of the 19th century, when, according to Polanyi, full-fledged capitalist culture occurred, was set up in the 16th and 17th centuries. As stated above, enclosures increasingly eliminated peasant copyholds, making remaining villeins at-will tenants, meaning they could be removed from the land at the whim of the owners without legal recourse and protections. Rents levied on leases of land and housing increased significantly over this period of time.

It wasn't that peasants didn't rebel against the nobility's reneging on its responsibilities; reactions were found from the high medieval period onward. Enclosing barriers were destroyed. Major rebellions, from Kett's Rebellion in Norfolk (1549) to the revolt in the Midlands (1607), "would mark numerous protests against specific enclosures well into the eighteenth century" (Fields, loc 770). Violent crackdowns by local magistrates and state authorities met the insurgencies. It was also a time of heightened religious persecutions in support of elites, including church "inquisitions" of heretics and other evil-doers from among the lower-class rabble who were upsetting burgeoning commercial successes and calling questionable practices into question.

Incidents of poaching were on the rise throughout the long period. Capital punishments also increased, especially following armed poaching by masked raiders in the 18th century, not infrequently sending an offending party to the gallows for stealing a goose. Law enforcement groups were organized by gentry and the idyllic rural countryside was laced with precarity and the thievery and violence that accompanies it. Poaching only increased during the 19th century in spite of harsh laws inflicting weighty penalties.

After the Glorious Revolution of 1688, with the merchant classes and large landowners winning the day in controlling English monarchical power, the enclosure processes began to hit full stride, not stopping until the beginning of the 20th century. With the triumph of the capitalist classes came an onslaught of parliamentary actions that assaulted the traditions of law protecting rights of common land use, passage and access rights, collective farming, and tenant occupancy rights.

How did the once landed peasantry survive other than by illegal means?

The newly un-landed hired themselves out to large landowners for wages. Wage labor became a necessity if the peasantry was to survive.

A degree of wage labor did exist prior to the enclosures. Landless folks unable to fully support themselves from farming would hire out to a manor lord, as well as to yeoman farmers who held larger plots, and even to a cohort of villeins who had gained more copyhold land access. There were also local and increasingly regional cottage industries making needed items for use by local communities. However, wage labor was not a typical form of sustaining life. Society largely frowned upon it. Even if self-sufficient peasants were themselves poor, they were independently self-supporting. The social mores around idleness were severe, to say the least. A chronically offending vagabond (an unemployed, unauthorized traveler between parishes), or an able-bodied beggar, could be put to death.

Unemployment, or underemployment (pauperism), on a scale never before seen was now a regular feature of the countryside. Enclosures were a key feature propelling this, although it must be granted that other processes were also at work. There were instabilities in commodity prices making for fluctuations in available work, especially after increasingly nationalized and globalized mercantile capitalist markets spread.

During this time of upheaval, merchant groups were procuring charters for expanding once-local cottage industries into manufacturing settlements, towns making commodities for sale to a larger region and to the growing international markets, including the supply of finished goods to colonial populations abroad. Un-landed peasants supplied them with workers. The towns grew into manufacturing cities with a workforce that, as of 1795, was no longer restricted to manors and feudal life under a lord. With the Industrial Revolution in full swing, the industrial cities were fostering the inhumane conditions that Charles Dickens spoke about in the early 19th century and Upton Sinclair in the early 20th century. Historical social research has confirmed their insights into urban squalor as the countryside was emptied of its hungry inhabitants needing wage labor for survival and travelling wherever work could be found.

And very usefully, the power of the parish craft guilds was broken. Previous to the privatizing of land as a commodity, guilds, through the manor courts, had authority controlling production: how many producers of given items operated in a geographic location (averting undue competition), the prices of goods and services (with minimal standards of living specific to professions), as well as product quality, including worker training in all aspects of production, social support for the infirm, and so on. When agrarian society was dismantled, so was the cultural power of the guilds that had helped control individualistic merchant greed and manorial excesses. Meager wages now bought necessities with money and no longer through locally defined economies based on reciprocity and social convention. Now, "…all incomes must derive from the sale of something or other, and whatever the actual source of a person's income, it must be regarded as resulting from sale. No less is implied in the simple term 'market system'" (Polanyi, pg. 43). A national labor market being birthed in England was coming to full force.

Under the growing labor market, income systematically fell short of meeting basic sustenance. Pauperism was becoming rampant. Previously, wages under the guilds were designed to be adequate for a worker to sustain his family appropriate to life's social stations (not everyone was to have the same standard of living). Not now. The roles of women changed along with the men's, as their labor responsibilities took shape in the mass-producing industries of factory towns, replacing those of the cottage. Hiring out as domestic servants took women away from the fields. Men, women and children, in making ends meet, were forced to work long, arduously monotonous hours of hard and dangerous factory labor. They were re-socialized into labor's divisions imposed from the outside, theorized in Smith's production of pins for maximizing profits, and no longer worked as laborers who once made end products through all stages of manufacture.

The new labor force was disciplined and punished (M. Foucault) into new cultural configurations radically different from the centuries-long traditions that formed them, traditions that were based on the cycles of nature and the harvest. Now lives were reflected in the factory and the inhuman drudgery it imposed. It was socially fracturing, contributing to alcoholism, domestic abuse, petty thievery, and the ills and disease that infested their new environs.

Was there social relief for an enlarged population in distress? There was.

Beginning in 1494 and continuing through 1547 and beyond, laws were formed distinguishing between the deserving and undeserving poor. Those deemed deserving had recourse to support in special living arrangements, poorhouses, with orphans and children of single mothers living with the elderly and those unable to work, the infirm. Those deemed undeserving were consigned to harsh workhouses that only grew harsher as time went on. But as the feudal social system continued breaking down, more vagabondage, begging, and unemployment occurred, growing the populations of the workhouses and debtors' prisons.

The latter part of Queen Elizabeth I's reign witnessed the enactment of Poor Laws; the 1601 laws codified the previously established legislation and harsh penalties surrounding begging and undocumented travel between locales, and more systematically defined what was meant by being poor. They also provided relief for the pauperism plaguing the crown's subjects. These laws were administered at the local parish level, with great disparity between parishes in who got assistance and how much was given. The funding was also local, through compulsory taxation rates on land, pressuring mostly smaller landholders and business people, not the elite, whose personal wealth was exempted from the tax. Keeping costs down was a big motivation. Population movement to parishes that were better off and had better benefits was restricted, with stark limits placed on travelling without appropriate permits in the mid-17th century.

The Speenhamland System, enacted around the same time as the national "freeing" of peasants to travel between parishes (1795) so they could find work, provided an allowance based on the price of bread and family size, since grain prices were increasing primarily due to England's part in fighting the French revolutionaries and then the Napoleonic Wars, as well as subpar harvests. It was intended to answer the cries of low wage earners for a "right to live," something wage rates did not afford. The system was a failure on many accounts.

Speenhamland, though national, was administered unevenly on the local level, and mostly in rural areas where peasant unrest was an ever-present reality. Little assistance went to the newly urbanized populations who also needed it. It became too easy for a worker not to work, or not to produce at a level of one's reasonable capacity, since there was a guaranteed income, even though survival on it was dismal. This was demoralizing and corrosive to long social traditions of self-sufficiency and the benefits work provided for families and their cohesion. Being "on the dole" carried a social stigma too similar to that attached to the workhouses and poorhouses, even though the system paid benefits "outdoors," with recipients not having to live in one of the houses of disrepute while receiving assistance. The local taxation pressure placed on the smaller landholding and manufacturing employers made for deep animosity between those receiving benefits and those supporting the system.

The taxation levels only grew because of an important central flaw in a well-meaning system. There was never any pressure for raising wage rates. Quite the contrary. Employers lowered the wage rates as much as they could, knowing the system would make up the difference. Further, commodity prices remained high, since public money was provided based on the price of bread. It also made what jobs were available more unstable: when profit levels fluctuated, an employer could readily eliminate positions knowing workers would retain a basic income regardless. When it became clearer that the program hurt capitalist production in the long run, important public opinion decried the Speenhamland system, again, a program well intentioned but poorly conceived. Though costly to those paying the funding rates, it mostly hurt and dispirited the intended recipients, the working poor.

Prominent public policy figures and theorizers, such as Edmund Burke, Jeremy Bentham, David Ricardo, and Thomas Malthus, railed against the relief. Foremost, a "free" self-regulating market for labor was being jettisoned by Speenhamland. Ricardo suggested that wages would "naturally" stabilize at a basic survival level. Malthus felt "right to live" wages would only encourage the poor to reproduce beyond the capacity of the land to sustain human life, and that starvation was a natural phenomenon of necessary population control. For him, the poor would only waste any wage surplus in the alehouse. (The Reverend Malthus appears an avatar for some of our current forms of (un)Christianity.) In the end, the old Poor Laws under Speenhamland were repealed and new Poor Laws enacted in 1834, cruelly sharpening the conditions of relief and removing outdoor assistance, making all who received support live in the workhouse, separating family members and stripping any dignity from life in a place worse than the poorest of working people's conditions.

At bottom, under the new Poor Laws, being poor was declared a moral character deficiency.

Polanyi indicates that when the new Poor Laws of 1834 were implemented, a completely commodified labor market was born, the last of his three processes of commodification necessary for shaping capitalist society. Removed from subsistence sources and the land supporting agrarian communities, people were now left to a labor market to decide wage payments, hunger motivating a workforce to take whatever wages were offered them.

Yet, ironically, capitalism created its own critics and chief nemeses in a phenomenon labelled the proletariat.


Retorts and Exports

Though we've engaged an English commodification history, the process didn't stop within those borders. Other like-minded countries of Europe quickly followed the economic path the English elite trod, they too transforming their agrarian societies: the newly independent Netherlands of the 16th century, France, Germany, latecomer Russia, and a number of others joining the capitalist profits parade. However, it wasn't as if it was smooth sailing in socially transforming cultures into capitalist societies.

Retorts of seething revolutionary rumblings burst into action in the revolts of 1830, widespread throughout Europe, where demands were made for greater public participation through parliaments shaping their countries' direction and placing limits on monarchies. Even more unsettling to the system run by ruling elites were the revolts of 1848 (the People's Spring), involving a large part of western Europe, which demanded greater democratic voice in national affairs. It didn't stop there, as the Paris Commune shook all of Europe's profit-oriented leaders when coalitions of workers, craftsmen and artisans declared Paris independent from the French national government and showed the world that the rabble was very capable of self-governance outside the existing capitalist culture foisted upon them. They had to be utterly crushed to prevent others from attempting the same. And, brutally, in 1871 the French army, with the aid of the Prussians who had just signed a peace treaty with a defeated France, along with the political support of the rest of capitalist Europe, did just that. An alternative to capitalism was decapitated.

It's important to recognize that capitalism demands removing any ability of people to sustain themselves through other alternatives. This is an obvious lesson modern English history taught us when the population was removed from self-sustenance via land privatization while having a wage system thrust upon it. Capitalists also implemented minimal programs when necessary to thwart revolt against the system. The remedies of the Poor Laws were most active in areas bordering on revolution.

Capitalist leadership learned that making the labor market more humane through safer workplaces and providing some benefits to workers, like Bismarck's reforms in the latter part of the 19th century, would ensure the system could continue operating. However, any reforms were still in the context of a system that had already commodified land and working people, destroying any existing forms of collective self-sufficiency. When reforms impede the profit-making roles commodified land and labor possess, the impediments are removed, just as the Poor Laws were banished and harsher rules implemented.

This historical fact is very apparent today. The systemic pressures to remove the welfare side of the state's responsibilities to its citizens, obligations demanded and won through generations of workers fighting (and dying) for greater security, respect, and dignity, demanding the "right to live," have never been greater. Witness the roll-backs of wage gains throughout the world; the cut-backs of public safety nets under austerity as capitalists assume less responsibility for funding public programs; the privatization of public services through selloffs and private/public "partnerships," from national parks to roadways, from bridges to postal services (England), even retirement programs (Chile); and fully publicly funded health-care services being transformed into profit-making endeavors.

I emphasize, it's not about neoliberal capitalist ideology versus a good capitalism where a paternal state will protect its people from the system's excesses. The socially destructive aspects of commodifying life have been apparent from capitalism's mercantile origins regardless of the liberal-electoral forms undertaken in the 19th and 20th centuries that attempted to mitigate capital.

Stronger state interventions attempting to train an untrainable and individually greedy capitalism may have helped over the very recent historical anomaly of capital's golden age after the Second World War, but not over modern history's long duration. Just as remedies were applied to social ills created by the enclosures and the labor market, the modern state, intrinsically wed to capitalism, will suspend its paternalistic role whenever so demanded by capital. And under severe systemic threats, capital and its nation-state merge into a totalitarian state-run economy and society, taking fascist forms as seen following World War I and during the Great Depression after the 1929 market crash, and as is now seen again, if under 21st-century circumstances.

The same commodification processes of land and labor, and everything else it can, continue in various stages throughout the globalized world. It's the metastatic legacy of imperialism. This is where my fenced land and limping old man come in.

Consolidation of land into the hands of a few, now fueled by major corporations and investment firms (witness the voracious corporate land grabs in the food-insecure areas of Africa and other parts of the world), dominates the remaining vestiges of a natural world ripe for generating profits while feeding populations of consumers. The mass migrations of the last few decades have forcibly expelled innumerable populations in the developing world from their ancient sustaining lands into cities where wage labor now provides their sustenance. Giant agribusinesses force-feed existing agrarian societies with land-exhausting technologies when those societies have been very capable of feeding themselves apart from global commodity markets while sustaining their land (unless disasters caused by global warming or profit-driven wars befall them). Just like the brief history outlined in this essay, the globe is undergoing the processes of social transformation needed to make capitalism supreme, regardless of any preexisting cultural structures.

Cultures founded on capitalism are irreformable. Mitigating reforms attempting to save societies, reforms that took many generations to achieve, are being dismantled in far less time than a single generation. And answers don't mean trying to reassemble pre-capitalist pasts. These are long gone, with social aspects that should be gone. Potential futures are bleak should capital remain dominant.

It will take visions of a flourishing future without capitalism from a new generation of the systemically dispossessed and disgruntled, dreams looking past the present while recognizing how this moment came to pass by critically engaging history, and then saying: no longer. Local ways will be found for working around the current system, and then for networking local successes into regional and more global alliances building new futures (local change cannot stand alone). Whatever alternatives take shape, they must not include making all things into commodities if life on the planet, as we know it, is to prevail.

The old man likely limps from his hard labors. A limp of mine from my past labors is now exposed.

Academia's Other Diversity Problem: Class in the Ivory Tower

By Allison L. Hurst and Alfred Vitale

"How can you know anything about the working class?" asks Ernest Everhard, the protagonist of Jack London's 1908 dystopian novel, The Iron Heel as he addresses a group of liberal do-gooders and college administrators. They can't possibly know the working class, he argues, because they don't live where the working-class live. Instead, they are paid, fed, and clothed by "the capitalist class." In return, they are expected to preach what is acceptable to that class, and to do work that will not "menace the established order of society". While this was written over 100 years ago, for many working-class academics (those of us who grew up poor or working class and climbed into academia), this conversation rings true. Many of us have presented some variation of it at one time or another to our more privileged academic colleagues.

Watching this past election cycle has been difficult for us. It has reminded us of the gap between the places we currently inhabit (the so-called Ivory Tower) and the places we originally came from, which we still visit from time to time. We cheered Bernie when he came on the scene, because he appeared to understand this gap and promised to make things better for the people we loved. When Trump began overtaking other candidates, we were not as surprised as the people around us seemed to be, because we understood that his message, cloaked as it was in misogyny, nativism, and racism, was directed at real issues long overlooked by the Democratic Party. We held our breaths, hopeful that Sanders would take down Trump, that his message was the message of change and kindness rather than change and hate. After the primaries, we crossed our fingers but felt the DNC had made a major blunder in nominating a candidate who stood for everything that people seemed to be fighting against - business as usual, neoliberalism, paternalism.

Both academia and the DNC have a class problem. They don't know anything about the working class because they have isolated themselves from working-class people. We have been struggling for years to change this within academia. In 2008, after a few years of discussion among comrades, a group of us formed the Association of Working-Class Academics (AWCA), a group for people like us with lives that straddled the working class and middle class. We wanted to bring class into the academy, to get people talking about it, aware of it, doing something about what we saw as an unsustainable growth of economic inequality. We had parents without retirement income, brothers with back-breaking jobs, sisters without the ability to pay for childcare, generations who faced joblessness or an attempt at a local college, with accompanying debt. We knew firsthand that things had shifted somewhere in the promise of the American Dream, that good jobs were harder to come by, that many people didn't have the luxury to plan and save and think about retirement. We thought that having more faculty with backgrounds like ours would provide natural checks-and-balances on academic discourses that tend to move far away from the reality of class as lived by the overwhelming majority of the population.

It hasn't exactly worked out that way. Discussion of social class has always been relegated to the margins of academia. In turn, public discourse about class is muted. By denying the opportunity for social class to be a valid academic subject in itself, or to be considered an authentic form of social identity, educated folks (academics, pundits, campaign managers, and journalists) didn't just silence the voices of the poor and working-class, they also denied the possibility of critically engaging the problem of affluence. How to critique Trump without this? His status as a member of the billionaire class was not seen as problematic, despite all we know about the power and impact that class has on the very real experiences of the vast majority of Americans. He was lampooned as a buffoon, then excoriated for his bad manners, and finally deplored for his many bad acts, all of which left the essential issue of a billionaire running on a platform of economic populism relatively unquestioned. When we saw the picture of the Trumps and the Clintons hobnobbing in evening wear, we thought, "This will nail him!" But that picture was never used by the DNC, because it would target their candidate as well. Plus, it wouldn't have been polite.

A society more sensitive to the complicity of the ruling classes, more willing to eschew the sycophancy and reverence given to the already overwhelmingly privileged, more capable of resisting the urge to lionize the affluent, and more attentive to the ubiquitous power handed over to the 1%, would have appropriately vilified Trump and dismissed him well before his name went on the ballot. We can spend time looking at any number of reasons for his victory, but we must ask the bigger question of why an unabashedly greedy billionaire glided through the primaries and general election without any real resistance. Could it be the case that we have consistently neglected to blame, unequivocally, the economic elites for inequality, and to hold them accountable for it? Where was the critical intellectual attack on the damages reaped by the excesses of the 1%? This takes us back to Jack London's protagonist Mr. Everhard, and his suggestion that such critique would "menace the established order of society." It may be true that many university researchers have studied poverty and made it their social justice duty to try to understand and ameliorate it. But the lens is most often focused downward, to poverty, and there has been virtually no research directed upward at the practices and mechanisms by which the affluent cause, exacerbate, benefit from, and rely on the steady continuation of inequality. And the occasional whispered squeaks of condemnation for the wealthy fade quickly, subsumed by the jingoistic, pragmatic liberalism of the well-educated in an academic world increasingly shaped by the whims of the donor class.

This form of economic censorship, justified by the neoliberal character of institutions of higher education, ensures that no acceptable critique of affluence will become sewn into the fabric of pedagogy. It is our contention that if academia were proportionately represented by faculty and students from the poor and working classes, the influence of the donor class on the institutional structure could be counteracted at the immediate level of teaching and research as a matter of course, rather than as an occasional garnish on the obligatory "race, class, gender" courses offered in many college departments. Discourse would create a resistance to the universalizing narratives compressing "poverty" and equalize it through a reciprocal comingling with intersectional narratives condemning the oppressive presence of affluence. If social class is duly acknowledged as salient, we will have to problematize and identify the systemic sources that shape the dominant narrative. Such a critique will require an indictment of capitalism as it stands, and therein lies the problem: how can we expect a real, class-sensitive critique of affluence in a milieu that tacitly condones its pursuit and happily reaps its benefits?

But, you may be asking, is there some problem here? We all know that academia can seem far removed from the day-to-day social worlds of most people, so what does it matter if academia doesn't want to indict affluence? Let's consider this question in light of the recent failures of the Democratic Party, and its slow slide away from economic populism and into neoliberalism since the days of Bill Clinton. Let's acknowledge that the increased dismissal of social class discourse in academia coincides with the current chasm in understanding between those who run the Democratic Party and those whom it purports to represent.

In many ways, the Democratic Party is like the Ivory Tower. They have both distanced themselves from a class awareness that they profess to have-so much so that they have forgotten and refused to acknowledge what social class means to actual people in the world. They have nominally acknowledged oppression, but have not really invited the oppressed into their circles; consequently, they assume they will have the support of the oppressed when it's needed. Diversity (or, rather, the lack thereof) remains a major problem in both academia and the Democratic Party. Both the party and academia have come to rely on a cadre of affluent donors, thereby shifting their priorities to fund-raising, advancement efforts, and the doling out of reciprocal favoritism, influence and rewards to the philanthropist class.

This diminishing attention to social class, both culturally and academically, paralleled the decline of unions in the United States, the crumbling of rust belt cities, and a sweeping upswing in inequality. The poor and working-classes ceased to have even a small amount of power, and were picked clean by things like predatory lending, healthcare costs, student-loan debt and skyrocketing college costs, jobs moving overseas, and major cutbacks in the social safety net. Relatedly, while scholarly attention to other factors in human experience such as race or gender grew exponentially - and it is true that there are deliberate efforts at most universities to invite more faculty from diverse race and gender backgrounds - there remains a relative and concerning scarcity of minorities as faculty members or students, and in particular, of working class and poor faculty and students. It may be the case that the very structural class dynamics most liberal professionals have neglected could help explain why they're having such a hard time ensuring equitable racial and gendered distributions in the University and the meritocratic beyond.

Although access to higher education has helped some members of the poor and working-classes "move up" in the world (we are witnesses to that), the numbers remain stubbornly small. Our colleges continue to serve children of the elite, or at least children of the highly educated. Proportionately speaking, faculty in universities do not reflect the existing social class strata that exists outside the walls of the Ivory Tower. This is not likely to change. Many academics from poor and working-class backgrounds are in disproportionate amounts of debt because they had to pay for the academic entry-fee themselves, and the tuition prices went up as the lines got longer. As it becomes more expensive to fund a graduate education, we will continue to find a smaller percentage of employed academics that come from poor or working-class backgrounds. The academic system keeps out the rabble, as it always has. This, in turn, comforts the donor class, who are assured that their role as instrumental philanthropists (i.e., manipulative tax-avoiders) will continue to garner them the reverence that their economic power naturally deserves - all without any means for resistance by the masses.

Thus it stands that the absence of real class awareness and the glacial pace of diversity efforts plague both the Democratic Party and institutions of higher education. Perhaps both the ivory tower and the DNC should stop publicly trying to recruit the poor and working class to become members of the liberal elite, and privately insulting them if they aren't. Instead, maybe we should ask ourselves what we can do to make academia privilege the voices of disenfranchised people, rather than the elite group speaking on behalf of them. Perhaps then, in 2020, our collective voices will shout to the elites the same words spoken by Jack London's Ernest Everhard:

"We know, and well we know by bitter experience, that no appeal for the right, for justice, for humanity, can ever touch you. Your hearts are hard as your heels with which you tread upon the faces of the poor. So we have preached power. By the power of our ballots on election day will we take your government away from you."


Alfred Vitale, Ph.D., and Allison L. Hurst, Ph.D., are two co-founders of the Association of Working-Class Academics, a non-profit that was recently absorbed into the Working-Class Studies Association.

Standardization as a Tool of Oppression: How the Education System Controls Thought and Serves as a Gatekeeper to the Ruling Elite

By Kali Ma

The "ruling elite" is a tiny minority roughly comprised of the nation's top 1% income earners who own more wealth than the bottom 95% of the population combined.[1] Those who make up this ruling elite are wealthy, mostly white, individuals. They are overwhelmingly educated at the most prestigious elite institutions and are the leaders in all major fields within society.

In order for this tiny minority to rule over the majority, it needs mechanisms in place to keep the majority from overtaking its power. Our standardized education system serves as a vital gatekeeper to the ruling class and legitimizes their power and authority. Standardization - or the use of pre-determined measures to judge individuals - is essential to controlling thought and promoting a particular ideology to the exclusion of all other perspectives. Ideology in this context means a set of values, beliefs and ideas shared by a group of individuals that reflects their economic, political, and social interests. For an ideology to become dominant, it must be accepted by the majority and serve as a lens through which most individuals view society. The more people interpret the world through a particular perspective, the more power those who benefit from that perspective gain.

Standardization is vital to perpetuating the elite's ideology and serves to: 1) legitimize the rule of those in power; 2) train individuals to obey and defer to authority, as opposed to teaching them critical thinking skills; and 3) exclude competing perspectives and people that threaten the interests of the ruling class. The education system is particularly effective in meeting these objectives because it presents itself as a system of merit where students are rewarded in proportion to their efforts. However, when we examine the education system more closely, it becomes clear that its structure heavily favors affluent individuals and those most likely to further the elite's ideology.


Legitimizing the Ruling Elite - The Myth of Meritocracy

Central to the legitimization of those in power is the myth of meritocracy, which consists of two main assumptions: 1) that individuals succeed in proportion to their abilities, and 2) that those in leadership occupy their positions because they are the most intelligent and talented individuals in society. It also asserts that anyone can attain this elite status if they possess superior abilities and talents.

As a result of these assumptions, meritocracy advances the philosophy that certain individuals are "superior," which legitimizes the rule by the "superior" few over those perceived as "inferior." This separation into "inferiors" and "superiors" takes place in our education system, which constantly ranks students based on standardized criteria. "Inferior" are those who, through inherent or self-created deficiencies, do not meet the "standard" and are, therefore, deemed unqualified or unintelligent. In other words, their voices and perspectives are silenced in favor of those who meet or exceed the standard. Persons deemed "inferior" simply become the subjects of power and thereby outsource their decision-making to a tiny privileged elite.

The most talented and intellectually "superior" individuals usually go on to attend our nation's elite universities. Contrary to the claims of meritocracy, however, students who attend these elite institutions are not necessarily more intelligent or talented, but rather enjoy the advantages of their socio-economic privilege.

Meritocracy Myth Debunked: Elite Schools and the "Intergenerational Reproduction of Privilege"

Elite universities play an essential role in generating new members for the ruling class and legitimizing their governance over the majority. Analyzing the process that produces this ruling elite is key to revealing how an affluent, mostly white, minority still remains in power today.

Instead of public schools, upper-class children attend exclusive private schools, expensive prep or boarding schools, and eventually enroll at our nation's elite universities. Throughout their lives, they are groomed to be society's leaders and are constantly reminded of their "superior destiny." As a result, they are confident about their abilities and view lower classes as subjects to be led, ruled, and guided.

The dichotomy between the upper class and everyone else becomes obvious when we examine elite institutions. According to a study, only 6.5% of Harvard students received federal financial aid in the form of Pell Grants, which are generally given to students in the bottom half of the income distribution. [2] This means that only about 6.5% of students from the bottom half of the income bracket were enrolled at Harvard during the 2008-2009 school year. Nearly three quarters of all students at elite colleges come from the top income quartile, while only 3 percent come from households in the bottom quartile. [3] The top 25% in terms of income are 25 times more likely to attend a "top tier" college than are those in the bottom 25%.[4]

Most high-achieving, low-income students outside of urban areas do not even apply to selective universities because of geographic and social barriers. [5] Many lack the basic information about "top-tier" institutions while others simply do not know anyone who attended a selective university, and likely, sense that they do not belong in these schools.[6]

Admission into elite universities heavily favors the privileged in several ways, including: preference given to family legacy students, those who can afford to pay full tuition, and students who receive high scores on standardized exams for which tutoring is essentially required and usually quite expensive.[7] "Legacy applicants" who had at least one parent graduate from an elite institution are up to 45% more likely to be admitted to that school.[8] On the other hand, a study revealed that during the admissions process, elite schools awarded zero points to low-income individuals for their socio-economic status, thus failing to acknowledge the obvious economic and social disadvantages those students had to overcome in order to achieve academic success. [9]

Clearly, privileged individuals have significant advantages when it comes to enrollment at our nation's "top tier" institutions. This, however, is not entirely the result of their own efforts as the myth of meritocracy would have us believe, but rather the socio-economic advantages tied to their affluent status. Notably, even members of the elite establishment have admitted that the system favors the wealthy: according to Anthony Carnevale - former Clinton administration appointee and current director of the Georgetown University Center on Education and the Workforce -"The education system is an increasingly powerful mechanism for the intergenerational reproduction of privilege."[10]


Standardization Teaches Unquestioning Obedience

Meritocracy also assumes that all individuals are equally situated and can therefore be properly judged by the same measures. Merit is determined by extensive use of standardized exams that evaluate students' aptitude and rank them based on criteria established by the power structure.

Most schools today do not encourage children to think critically or express themselves in their own way; instead, they teach students how best to restate what they have learned. Individuals who memorize well and are able to repeat certain facts most closely to the expected standard are considered intelligent and rewarded with good grades and high scores on exams. Creativity, thinking outside the box, raising questions that challenge the status quo, and engaging with the learning material in a lively manner are simply not tolerated. Very rarely are students rewarded for their own critical thinking and creativity. A system that expects students to memorize and copy a pre-determined standard does not teach critical thinking or the sharing of different ideas and perspectives - it teaches obedience.

Proponents of standardized testing claim that the exams have the ability to assess students' abilities and predict future success. Standardization teaches us early on that there is a prevailing, dominant measure by which all people can be legitimately judged. As a result, it effectively promotes only one type of assessment based on the values of the dominant ideology to the exclusion of all other measures and perspectives. In other words, students are taught to believe that only one particular set of skills is valuable and that there is only one type of "intelligence" worth expressing. Standardization is, in effect, an authoritarian mechanism that measures a student's compliance with a set of criteria or answers deemed "correct" by those in authority. There is no independent critical or analytical thinking involved, which is exactly what the ruling elite - who depend on an obedient and unquestioning populace - count on.

The values the dominant ideology promotes directly and indirectly through standardization are: unquestioning obedience to authority; the importance of such obedience; the belief that only certain skills and types of intelligence are "superior"; and that those in authority are the most qualified to occupy positions of power. These values and beliefs provide great deference to authority and obviously benefit the ruling elite.

Standardized exam performance also has a considerable impact on one's future educational and life opportunities; thus, it is a highly effective mechanism for separating individuals into their respective socio-economic ranks. The fact that standardized exams produce results that disproportionately disenfranchise minorities and lower classes is key to eliminating competition and securing the power of the ruling elite.


Standardized Testing: A Mechanism for Exclusion

Keeping the ranks of power homogeneous is essential to promoting a particular ideology that benefits the ruling class. Different perspectives and "outsiders" are a direct threat unless, of course, they can be assimilated into the system and used to promote its agenda. The mechanisms by which individuals are excluded are mostly covert and appear under the cloak of meritocracy which asserts that the "best and the brightest" naturally succeed.

Exclusion Based on Economic Status, Race, and Ideology

Racial and economic inequalities are ongoing problems that have never been properly addressed. In fact, economic inequality, which disproportionately affects women and minorities, is worse today than it was during the Great Depression.[11] In addition to pure racism, sexism and classism, systemic exclusion of most minorities, women, and the poor also serves to eliminate competing political interests and exclude different perspectives that threaten the interests of the ruling class.

1. Socio-Economic Exclusion

Most universities, including elite institutions, still use standardized testing as an important factor in admissions. Test scores from the SAT show white, wealthy students consistently outperforming minorities and the economically disadvantaged by a wide margin. [12] The results imply that the most intelligent and successful individuals within our society are wealthy whites.

Based on these results we can either believe: a) that the tests are legitimate and that minorities and economically disadvantaged individuals are inherently inferior to white, wealthy students, OR b) that minorities and economically disadvantaged students are not inherently inferior, and that the tests are illegitimate as assessors of intelligence and predictors of future success. If we believe that the tests are legitimate and that students perform poorly because of financial disadvantages, then we must still reject this unfair assessment that disproportionately affects economically disadvantaged students.

According to Edwin Black, author of War Against the Weak, standardized exams such as the SAT serve as "vehicles for cultural exclusion." [13] Research linking test performance to family income suggests that what these exams really measure is an individual's access to certain resources like test preparation classes, tutoring, and private school education. [14] A study recently found that a student's socio-economic background has a "considerable" impact on his or her secondary educational achievements, particularly in the United States.[15] Standardized testing exploits this disadvantage and efficiently keeps people in their respective socio-economic ranks.

With so much emphasis placed on standardized testing, it is the perfect tool to prevent individuals from rising above their economic statuses in a seemingly legitimate way. Generally speaking, unless a person is well-connected - which often comes with wealth and social status - they are unlikely to do much better economically than their parents.

By continuing to legitimize standardized exams, it seems that we as a society have accepted the belief - consciously or not - that wealthy (mostly white) individuals are inherently superior. Interestingly, the origins of standardized testing are grounded in this exact racist and classist belief.

2. Racial Exclusion

Standardized exams and I.Q. tests emerged in the early 1900s and were extensively promoted by the eugenics movement. [16] The premise of eugenics was that Nordic, upper class whites were inherently superior and more intelligent than other races.[17] In the 1920s, Carl Brigham, a psychologist and figure in the eugenics movement, developed the Scholastic Aptitude Test, or what is now referred to as the SAT.[18] Brigham believed that whites born in America were inherently superior and more intelligent than other races, including southern and eastern European immigrants, whom he deemed equally inferior.[19] Eugenics was widely accepted throughout America's leadership class and heavily financed by influential organizations like the Carnegie Institution and Rockefeller Foundation.[20] Over a period of about 60 years, eugenics led to the forcible sterilization of 60,000 Americans who were deemed "unfit" due to race, social status or other "defective" traits.[21]

Is it a coincidence, then, that privileged white students disproportionately outperform minorities and economically disadvantaged students on an exam created by a man who firmly believed in the superiority of white, upper class individuals? Do we honestly believe that privileged whites are inherently superior to everyone else? And what does it say about the ideology of our ruling elite when some of its most influential members like the Carnegie and Rockefeller families financed an overtly racist and classist movement that led to the forcible sterilization of 60,000 people?

It is no coincidence that standardized testing promotes a certain type of intelligence that happens to benefit white, upper class individuals. The classist and racist implications of standardized testing are evident in their origins and results. By shaping the perception that certain groups are naturally unintelligent, the system dehumanizes whole classes of people and effectively silences their voices. The results provide seemingly legitimate "proof" that minorities and the poor are inherently inferior and that they deserve to occupy a lower rank in society. In truth, however, our education system is a convenient excuse to justify the position of those in power while giving the appearance, through seemingly legitimate means, that this power was attained in a fair and just manner.

3. Ideological Exclusion

Discrimination based on race and class is an intersection of several issues: pure racism and classism as well as the elimination of competing ideologies and political interests that would - at the very least - significantly weaken the dominant ideology. The inclusion of diversity is a direct threat to the homogeneous make-up of the ruling elite, which depends on its ideology to sustain its power. Being part of the ruling elite is not just about wealth, race, and social status: it is just as much - if not more - about sharing particular ideological perspectives that advance the interests of the privileged class as a whole.

For instance, while affirmative action programs have been instrumental in providing educational opportunities for racial minorities, they have mostly helped upper class minority students.[22] The fact that these programs assist mostly privileged students further suggests that the system favors the wealthy. One reason for this is that upper class individuals share similar social and economic interests with those in power and are more likely to advance the dominant ideology because they themselves have benefited from the status quo. As a result, they are less likely to challenge existing conditions in any significant way and are not viewed as a direct threat to the system.

It is important to note that simply placing women, racial minorities, or economically disadvantaged people into positions of power does not guarantee a diversity of ideas or that our system will become any more just. We only need to look at our current leaders in various areas who, despite their minority statuses, dutifully serve the power structure. What matters is not who a person is, but rather what values and beliefs that person actually represents. That is why standardization of education is such an effective tool - by imposing its own standards and values, the system shuts out all alternative perspectives that do not advance the interests of the ruling class.

"Success" within society most often reflects the extent to which a person obeys or furthers the interests of the power structure. This is true for individuals of all backgrounds and social classes. While some people from modest or minority backgrounds move up to the ranks of the privileged elite, they are few and far between and heavily underrepresented compared to their numbers within the population. Because success depends on obedience to the dominant ideology, there is a strong incentive to disregard one's own viewpoints and assimilate to the system's ideology. Obviously, not all individuals within society have identical perspectives; yet the system, nevertheless, compels most of us to suppress our unique experiences, observations, and impressions in order to prevent us from utilizing those perspectives to meaningfully challenge the status quo.

This repression is a direct consequence of standardization, which rewards obedience to authority and promotes a one-sided perspective to which all people are expected to assimilate. This is why the status quo is incredibly difficult to change: we are induced and indoctrinated into a mindset that only benefits those in power and severely restricts our self-expression. Any perspectives or ideas that fall outside of the artificial norm are disregarded, and the people who express them are often alienated or even punished.

The standardized education system is particularly effective in securing conformity because it makes "success" dependent on obedience to the dominant ideology that represents the interests of the ruling elite.


Alternatives to Standardization

According to educators who support systemic reform, a student-centered approach to education would produce much more equitable results.[23] A more holistic model for educating students would, for instance, teach children leadership skills and social responsibility, encourage them to cooperate with their peers, challenge students to critically analyze current events, and teach them to construct well-reasoned arguments to defend their ideas.[24] This type of teaching style would actively engage students with each other and foster critical thinking that encourages various viewpoints to enter into awareness. Such lively engagement would undoubtedly reveal talents, strengths, and abilities that standardized tests are designed to disregard.

Eventually, assessment of students would become much more equitable, because each individual would express different skills and talents as opposed to being judged by a fixed, homogeneous standard. There would be no preference for one type of intelligence, which would make standardized testing irrelevant. Without standardization, the system would find it much more difficult to promote its homogeneous ideology, legitimize the rule by a tiny elite, and justify its obvious discrimination against the poor, minorities, and alternative perspectives that challenge its power.

The essential feature of standardization is that it presents information from the perspective of those in power. For instance, corporate textbooks bury important historical facts and recount events from the one-sided point of view of the ruling class - presidents, businessmen, diplomats, and generals - thereby silencing the voices of ordinary people.[25] Recognizing this disparity, the Zinn Education Project offers teaching materials to educators based on Howard Zinn's bestselling book A People's History of the United States.[26] The materials introduce students to a more comprehensive and honest version of history viewed from the perspective of ordinary people. The lesson plans focus on the history of women, working class people, Native Americans, and people of color, as well as historical figures who are often mischaracterized or ignored in traditional textbooks.

One teaching strategy promoted by the Zinn Education Project focuses on role-playing during which students imagine themselves as various individuals throughout history and contemplate the circumstances and realities those people faced.[27] This creative technique encourages students to directly engage with traditionally ignored viewpoints and offers an alternative to the homogeneous (and often misleading) version of history promoted by the power structure.

As these few examples illustrate, standardized education is not the only option. There are many practical alternatives that bring education to life and teach students the analytical skills essential to understanding the world and viewing it in a more complex, accurate light.


Current Education System Is About Indoctrination

Conformity to a standard severely limits our possibilities and is a devastating waste of human potential that only benefits those in power. The eugenics roots of standardized testing reveal that these exams are not harmless assessment tools, but rather instruments of oppression.

When we analyze the outcomes our current system has produced, it becomes clear that its goals are not about educating students. The education system: disenfranchises the lower classes and racial minorities; makes academic success dependent on financial resources and obedience to the dominant ideology; imposes the same standards on all individuals, as opposed to cultivating their unique talents and abilities; silences different perspectives and expressions of intelligence; imposes standards that disproportionately benefit the privileged few; and teaches students what to think instead of how to critically analyze their environment.

These poor results are not a coincidence or the product of widespread incompetence - the system is simply designed to fail. This failure benefits only the ruling elite, who continuously remain in power, are never disenfranchised, never too poor to afford education, never "inferior" enough to occupy low-ranking positions in society, and whose perspectives are never excluded or silenced in the mainstream. The actual purpose of our education system is to indoctrinate individuals into the dominant ideology and to eliminate perspectives and people that challenge it in any way. This exclusion is reflected in the homogeneous ranks of power, which are overwhelmingly filled by wealthy, mostly white individuals who share similar political, social, and economic interests.

When power is concentrated in the hands of the few, it becomes easy to maneuver and manipulate. Mechanisms such as standardized testing are introduced by those in authority and are, therefore, effortlessly implemented into the system. We rarely, if ever, question the decisions of people in power because we have been taught to obey authority and defer to its "superior" judgment.

This is how a tiny 1% elite is able to rule over the majority without overt tyranny: by controlling thought and, in turn, behavior. The standardized education system is critical to achieving this objective and thus serves as a protector of and gatekeeper for those in power.



Notes

[1] Andrew Gavin Marshall, "The Shocking Amount of Wealth and Power Held by 0.001% of the World Population," AlterNet, June 12, 2013, http://www.alternet.org/economy/global-power-elite-exposed

[2] David Leonhardt, "How Elite Colleges Still Aren't Diverse," The New York Times, March 29, 2011, http://economix.blogs.nytimes.com/2011/03/29/how-elite-colleges-still-arent-diverse/?smid=tw-nytimeseconomix&seid=auto

[3] Thomas B. Edsall, "The Reproduction of Privilege," The New York Times, March 12, 2012, http://campaignstops.blogs.nytimes.com/2012/03/12/the-reproduction-of-privilege/

[4] Jerome Karabel, "The New College Try," The New York Times, September 24, 2007, https://www.nytimes.com/2007/09/24/opinion/24karabel.html

[5] Josh Freedman, "Why American Colleges Are Becoming a Force for Inequality," The Atlantic, May 16, 2013, http://www.theatlantic.com/business/archive/2013/05/why-american-colleges-are-becoming-a-force-for-inequality/275923/

[6] Marisa Treviño, "Study: Low-income, high-achieving students think prominent universities are out of their league," NBCLatino, March 20, 2013, http://nbclatino.com/2013/03/20/study-low-income-high-achieving-students-think-prominent-universities-are-out-of-their-league/

[7] Kristin Rawls, "4 Ways College Admissions Committees Stack the Deck in Favor of Already Privileged Applicants," AlterNet, November 12, 2012, http://www.alternet.org/education/4-ways-college-admissions-committees-stack-deck-favor-already-privileged-applicants

[8] Elyse Ashburn, "Legacy's Advantage May Be Greater Than Was Thought," The Chronicle of Higher Education, January 5, 2011, https://chronicle.com/article/Legacys-Advantage-May-Be/125812/?sid=at&utm_source=at&utm_medium=en

[9] David Leonhardt, "How Elite Colleges Still Aren't Diverse," The New York Times, March 29, 2011, http://economix.blogs.nytimes.com/2011/03/29/how-elite-colleges-still-arent-diverse/?smid=tw-nytimeseconomix&seid=auto

[10] Thomas B. Edsall, "The Reproduction of Privilege," The New York Times, March 12, 2012, http://campaignstops.blogs.nytimes.com/2012/03/12/the-reproduction-of-privilege/

[11] Annie Lowrey, "Income Inequality May Take Toll on Growth," The New York Times, October 18, 2012, https://www.nytimes.com/2012/10/17/business/economy/income-inequality-may-take-toll-on-growth.html?_r=0

[12] Scott Jaschik, "New Evidence of Racial Bias on SAT," Inside Higher Ed, June 21, 2010, http://www.insidehighered.com/news/2010/06/21/sat

[13] Edwin Black, War Against the Weak: Eugenics and America's Campaign to Create a Master Race (New York: Four Walls Eight Windows, 2003), p. 85

[14] Sean F. Reardon, "No Rich Child Left Behind," The New York Times, April 27, 2013, http://opinionator.blogs.nytimes.com/2013/04/27/no-rich-child-left-behind/

[15] Organization for Economic Co-Operation and Development, Economic Policy Reports: Going for Growth (2010), p. 187, http://www.oecd.org/tax/public-finance/chapter%205%20gfg%202010.pdf; see also Dan Froomkin, "Social Immobility: Climbing the Economic Ladder Is Harder in the U.S. Than in Most European Countries," The Huffington Post, September 21, 2010, http://www.huffingtonpost.com/2010/03/17/social-immobility-climbin_n_501788.html

[16] Black, 78-83

[17] Black, xv

[18] Black, 78-83

[19] Black, 78-83

[20] Black, 40, 93-99

[21] Black, xv

[22] Richard D. Kahlenberg, "Why not an income-based affirmative action?" The Washington Post, November 8, 2012, http://articles.washingtonpost.com/2012-11-08/opinions/35503696_1_racial-preferences-race-neutral-methods-grutter

[23] Jesse Hagopian, "'Occupy Education' Debates the Gates Foundation (and Wins)," Common Dreams, March 13, 2012, https://www.commondreams.org/view/2012/03/13-4

[24] Jesse Hagopian, "'Occupy Education' Debates the Gates Foundation (and Wins)," Common Dreams, March 13, 2012, https://www.commondreams.org/view/2012/03/13-4

[25] Teaching A People's History: Zinn Education Project, "About the Zinn Education Project," https://www.zinnedproject.org/about/, Accessed June 18, 2013

[26] Teaching A People's History: Zinn Education Project, "About the Zinn Education Project," https://www.zinnedproject.org/about/, Accessed June 18, 2013

[27] Bill Bigelow, "A People's History, A People's Pedagogy," Zinn Education Project, https://www.zinnedproject.org/about/a-peoples-history-a-peoples-pedagogy/, Accessed June 18, 2013