
On Hierarchies and Humanity: A Review of ‘The Dawn of Everything’ by David Graeber and David Wengrow

[Pictured: Waving across time, the Cave of Hands in Argentina, painted as far back as 11,000 B.C., reminds us that prehistory was filled with real people. Credit: R. M. Nunes / iStock / Getty Images Plus]


By Nathan Albright

Before his sudden death in September of 2020, two weeks after the completion of his final book, The Dawn of Everything, the beloved anthropologist David Graeber had demonstrated a knack for writing about the right thing at the right time. In 2011, his masterwork Debt: The First 5,000 Years, an inquiry into the relationship between debt, morality, and political power, was published just a few months before Occupy Wall Street began an international conversation on much the same. In addition to authoring the book and other writings that circulated in the movement, Graeber was actively involved from the beginning of the encampment, participating in direct actions and assembly meetings, and famously coining the Occupy mantra “We Are the 99%.” In 2013, Graeber again struck a chord with a short article titled On the Phenomenon of Bullshit Jobs, in which he batted around a few ideas about how a surprising number of jobs seem to be “utterly meaningless,” even to those working them, and speculated that this kind of work had become more widespread than is openly talked about. Within a week of its publication, the article had been read by millions and translated into over a dozen languages, prompting countless responses, mostly from readers effusively agreeing with Graeber’s premise and wanting to share their own anecdotes of meaningless work. In 2018, Graeber expanded the article, combining it with the testimonies that readers had sent him, along with some of his past writing on gendered labor, authority, bureaucracy, and imagination, into what became his most accessible and widely read book, Bullshit Jobs. Its main themes, especially his willingness to question what work is actually necessary, foreshadowed many of the discussions that took shape during the early days of the Covid-19 pandemic, as the public found itself split into essential and non-essential workers. In the years since, as droves of workers have quit their jobs in what is being called the Great Resignation, Graeber’s writing has circulated widely in online communities like the Antiwork subreddit, a forum where millions of users share stories of workplace abuses and encourage one another to quit their jobs and seek more meaningful and less exploitative ways of living.

Years before, in 2005, long before Graeber amassed much of a readership, he wrote a pamphlet titled Fragments of an Anarchist Anthropology, which, maybe more than any other work, serves as a prequel to his final publication. In the pamphlet, he asks: if anthropology is the academic field most familiar with the variety of social arrangements that have existed without structures of domination – those which have valued cooperation over competition, creativity over conformity, and autonomy over obedience – why does it, as a discipline, so rarely engage with social movements that are, sometimes rather aimlessly, trying to recreate just such conditions? Graeber concluded the pamphlet with a call for his fellow anthropologists to make common cause with social movements because, in short, “we have tools at our fingertips that could be of enormous importance for human freedom.” That same year, Yale University controversially declined to renew his teaching contract, a move that many saw as retaliation for his political affiliations. Sixteen years later, the posthumous release of The Dawn of Everything, coauthored with the archaeologist David Wengrow, shows that the incident only sharpened Graeber’s resolve to actualize the kind of ‘anarchist anthropology’ his pamphlet had envisioned, and to do so in a way that could not be ignored. According to Wengrow, Graeber insisted on publishing each section of the book individually in peer-reviewed journals before releasing the full compilation, in order to head off any efforts to dismiss their findings. What the two present in the volume is a radically different vision of human history, an attempt to broaden our understanding of what we, as a species, have been, that is as doggedly hopeful as it is rigorously researched.

Graeber and Wengrow set out from one of the oldest questions in the social sciences: “what is the origin of inequality?” They begin by turning the question on its head, asking instead: how did social theorists of eighteenth-century Europe come to be interested in the idea of inequality in the first place? The answer, they suggest, is simple: although Enlightenment thinking is often framed as the unique brainchild of individual European male genius, it was actually the result of an explosion of cultural exchange following first contact with the indigenous people of the Americas, who exposed Europeans to entirely different ways of thinking and living. Moreover, indigenous intellectuals like the Huron-Wendat chief Kandiaronk (whom the authors profile at length in the second chapter) were so horrified by the hierarchy, competitiveness, and poverty permeating European culture that they leveled scathing critiques of the inequalities they witnessed.

That this indigenous critique is to thank for Enlightenment debates over inequality is plain to see in the historical record. In fact, there is simply “no evidence that the Latin terms aequalitas or inaequalitas or their English, French, Spanish, German and Italian cognates were used to describe social relations at all before the time of Columbus.” And while most Western historians seem to overlook it, most Enlightenment figures openly “insisted that their ideas of individual liberty and political equality were inspired by Native American sources and examples.” It only makes sense that Enlightenment thinkers would have come across indigenous critiques: at the time, some of the most popular books circulating in Europe were summaries and accounts of cultural exchanges with Native Americans, usually in the form of dialogues and debates. In some of these debates, the European, often a Jesuit priest, argued at length against the idea of freedom as a virtue, a line of argument so utterly untenable by today’s standards that it is clear just how drastically indigenous thinking went on to shape the course of history.

The popularity of this indigenous critique left European pride bruised, and the authors suggest that reactionary thinkers, scrambling for counter-arguments to protect their sense of superiority, developed lines of thinking that have endured to the present day. The economist A.R.J. Turgot, friend and colleague of Adam Smith, pioneered the argument that societies move through stages of development marked by progressively more sophisticated forms of technology and subsistence. Thus, the mere fact that some indigenous peoples were hunter-gatherers meant that they were inferior. Graeber and Wengrow point out that Turgot latched on to technology and forms of subsistence as the measures of superiority because, when it came to quality of life, wisdom, happiness, equality, or freedom, eighteenth-century Europe was in abysmal shape. But the emphasis on forms of subsistence and technological progress stuck, and eventually developed into its two most enduring forms in the writings of Hobbes and Rousseau. Hobbes essentially argued that indigenous people were equal only insofar as they were equally poor and stupid, that life for the so-called uncivilized was a vicious “war of all against all,” and that the powers vested in modern governments were the only thing keeping people from tearing each other apart – reasoning that remains the bedrock of conservative thought. Rousseau, on the other hand, painted indigenous people as coming from a utopian state of innocence, totally unaware of inequality and uncorrupted by the kinds of inevitable technological advances, like agriculture, which had forced Europeans to lose touch with their own Eden.

While both formulations are patently false and racist in their own unique ways, the authors note that Rousseau was not wrong in observing that something had been lost in European culture that clearly did exist among indigenous groups. Settlers, when exposed to indigenous ways of life, overwhelmingly chose to remain, or even to return after failing to re-integrate into European society. Indigenous Americans brought to European society, on the other hand, would invariably seek every opportunity to escape and return home. Even European children separated from their parents and brought to indigenous communities would often choose to stay. But why? What was so different, so much more appealing about indigenous life? “The fact that we find it hard to imagine how such an alternative life could be endlessly engaging and interesting,” the authors suggest, “is perhaps more a reflection on the limits of our imagination than on the life itself.” Our understanding of life in other times and places, and therefore our idea of what is possible for our lives here and now, is shaped by our own narrow experiences. The authors offer an example: travel routes uncovered by archaeologists. These pathways have frequently been assumed to be trade routes, an assumption that reflects a modern obsession with markets; in actuality, these routes served as everything from elaborate circuits traversed by healers and entertainers, to inter-village networks for women’s gambling, to long-distance vision quests for individuals guided by dreams. “When we simply guess as to what humans in other times and places might be up to,” the authors write, “we almost invariably make guesses that are far less interesting, far less quirky – in a word, far less human than what was likely going on.” The rest of the book, with this in mind, sets out to retell some of the most widely misunderstood stories that we have come to believe about our past and, with the help of some of the most recent archaeological discoveries, to uncover this more colorful, more imaginative, more human history.

Some of the details the authors cover won’t be new to a student of anthropology: medieval serfs worked significantly fewer hours than the modern office or factory worker, and “the hazelnut gatherers and cattle herders who dragged great slabs to build Stonehenge almost certainly worked less than that.” Such details were made widely known in the 1960s in an essay titled The Original Affluent Society by the anthropologist Marshall Sahlins, who would later serve as Graeber’s thesis advisor. The authors confirm that the basic tenets of the essay have held up over time, but it, too, provided a limited picture of what pre-agricultural life was like. For one thing, the break between pre-agricultural and agricultural life was not so clean – not the “revolution” we have been taught, but a long process of experimentation. There were sometimes thousands of years between the first examples of agriculture in a region and any kind of consistent use for subsistence purposes. The first instances of farming looked a lot more like gardening, and were carried out with little effort, often in delta regions where seasonal floods would do most of the work of tilling, fertilizing, and irrigating – such liminal farming spaces also had a sort of built-in resistance to measurement, allotment, or enclosure. The first farmers in these spaces, it seems, were women, and they often grew herbs or ornamental crops rather than the food staples it had once been assumed they grew. There’s even evidence that early farmers actually worked against traditional hallmarks of plant domestication in order to avoid becoming solely dependent on their own labor to produce crops. The authors use the term “play farming” to describe this millennia-long process of leisurely experimentation and learning.

More to the point, Graeber and Wengrow want to get one thing straight: the advent of agriculture was not the revolutionary social, political, and cultural catalyst it has been heralded as. There is no evolution of social forms based on subsistence mode; how a group of people obtains its food doesn’t determine how it is politically or hierarchically structured. In fact, pre-agricultural life was not made up of roving bands of hunter-gatherers, but was “marked, in many places, by sedentary villages and towns, some by then already ancient, as well as monumental sanctuaries and stockpiled wealth, much of it the work of ritual specialists, highly skilled artisans and architects.” Similarly, the authors dispel the idea that greater scale – of a city, for example – necessarily results in greater hierarchy, a revelation that upends an assumption so widespread it has long been considered common sense.

Cities populated by tens of thousands began appearing around the world roughly six thousand years ago, and there is surprisingly little evidence of any kind of hierarchy among them. Neither, it seems, were they dependent on a rural population to supply their needs; instead they relied on subsistence gardening, animal husbandry, fishing, and continued hunting and gathering in surrounding areas. “The first city dwellers,” the authors write, “did not always leave a harsh footprint on the environment, or on each other.” Graeber and Wengrow provide examples of self-governing cities from regions around the world, including Ukraine, Turkey, China, central Mexico, Mesopotamia, the Fertile Crescent, the Indus Valley, and others. The evidence that most of these early cities were non-hierarchically organized is so compelling that, the authors argue, the burden of proof now falls on those trying to find evidence for hierarchy.

Life in these early cities, the authors write, was one grand social experiment: large public works projects, public housing, elaborate city planning, monuments, temples, and more, all before the widespread adoption of farming. In many of the largest early cities, the grandest projects – centrally located, elaborately adorned, and monumentally constructed – appear to have been public meeting spaces, likely for citizens’ assemblies, neighborhood councils, and any number of other forms of direct democracy. Indeed, popular councils and citizens’ assemblies were simply part of the fabric of life in early cities, even in the civilizations readers will be most familiar with – the cities of Sumer and Akkad and elsewhere in Mesopotamia – as well as among the Hittites, Phoenicians, Philistines, and Israelites. “In fact,” the authors write, “it is almost impossible to find a city anywhere in the Near East that did not have some equivalent to a popular assembly – or often several assemblies … even … where traditions of monarchy ran deep.”

In the Americas, this applies not only to prehistory but also to relatively recent history, including the time of first contact with Europeans. Even the conquistador Hernán Cortés wrote at length about the assemblies he encountered in central Mexico, comparing them favorably to the forms of Italian democracy he was familiar with. While we tend to learn the history of the few large-scale hierarchical empires in the Americas, like those Cortés most directly sought out, Graeber and Wengrow convincingly argue that the majority of indigenous Americans actually lived in social arrangements specifically formed in contrast to these empires, self-consciously arranged in such a way as to prevent forms of domination from emerging. The authors describe a world of constant experimentation and fluidity in the structures that did exist. Some groups, for instance, transitioned from rigid, short-term hierarchy during one hunting season to totally egalitarian relations during the next. The result of such fluid experimentation seems to have been a much more accepting, creative populace, where eccentricities were celebrated, and individuals could change identities, kin, even names from season to season as a spirit of perpetual reinvention and regeneration flourished. Far from Rousseau’s naive state of innocence, the indigenous people of the Americas were keenly aware of the dangers of hierarchy and had become adept at the art of heading off any signs of individuals amassing coercive power.

The evidence suggests that this fluidity of social forms and avoidance of hierarchy, alien as it may now seem, was characteristic of most social life for most of human history. The authors suggest that perhaps the most closely guarded values common to all people were autonomy (as in, the complete freedom of the individual from domination by others) and communism (as in, ‘from each according to their abilities, to each according to their needs’) – one serving as the necessary precondition for the other. Far from the realm of idealistic fantasy, these were the social relations in cities of hundreds of thousands of people that self-governed for periods of thousands of years. In fact, there are periods in the archaeological record of up to 500 years in which entire regions as large as “eastern North America” show remarkably little evidence of traumatic injuries or other forms of interpersonal violence. For the majority of human history, throughout the majority of the world, except for islands of hierarchy, people tended to resist domination and instead live in voluntary associations of mutual support.

A book with such an inspiring reinterpretation of human possibility, written by an author who on more than one occasion presaged social movements, is a glimmer of hope in a very dark time. As the largest protest movement in US history swept across the country, raising questions of police and prison abolition, demonstrators could have used a vote of confidence from anthropologists who are keenly aware that the kinds of possibilities activists are pursuing have not only existed but thrived for millennia at a time. Abolitionists may be heartened to read the Huron-Wendat chief Kandiaronk’s views on the European penal system: “For my own part, I find it hard to see how you could be much more miserable than you already are. What kind of human, what species of creature, must Europeans be, that they have to be forced to do good, and only refrain from evil because of fear of punishment?”

As wildcat workers’ unions, tenants’ unions, citizens’ assemblies, and mutual aid networks blossom around the country, participants must wonder on what scale this kind of grassroots self-organization is possible. The answer, according to Graeber and Wengrow, is entire regional federations of metropolises – as big as any hierarchical structure has ever managed, and arguably with much greater success.

What kind of a social movement could take form armed with the knowledge of the full spectrum of social forms throughout human history? Could humanity’s oldest values, autonomy and mutual aid, flourish again? Can the violence and rot of capitalist empire really be undone? If a reader takes one thing from The Dawn of Everything, Graeber and Wengrow want it to be this: nothing in history was ever predetermined. Neither Hobbes’ mindless automatons nor Rousseau’s innocent children of Eden ever existed. History is alive with self-conscious actors, constantly negotiating the conditions of their lives, with far more possible outcomes than we were ever led to believe. But more importantly, so is the present.

Nathan Albright is a building super in Brownsville, Brooklyn. His writing can also be found in The Catholic Worker newspaper and at TheFloodmag.com.

The Hidden Violence of Immigration Bureaucracy

By Daniel Melo

There are obvious forms of violence in the US migration system: the longstanding brutality of Border Patrol, migrant deaths in the desert, the conditions in detention centers. These are all recognizable facets of enforcing borders and immigration restrictions. But there is a subtler, less understood violence pervading the system’s bureaucracy, one that impacts the many thousands of people who are funneled through it.

The late anthropologist and anarchist scholar David Graeber took a compelling look into the nature of bureaucracy in his book The Utopia of Rules. He went beyond our usual disdain for the absurdity of bureaucracy to peer inside its heart. There, he found violence. In a series of essays, Graeber lays out how deeply bureaucratized modern life has become, from performing a bank transaction overseas, to handling an ill loved one’s affairs, to immigration, to opening a barbershop. For him, this “bureaucratization” of daily life is “the imposition of impersonal rules and regulations . . .” which “can only operate if they are backed up by the threat of force.” Throughout, Graeber drives home a key point: no matter how innocuous or well-intentioned both regulations and regulators are, they possess significance and weight precisely because they are backed by the real physical force of the state.

Graeber goes on to argue that bureaucracy and its attendant violence have “become so omnipresent that we no longer realize we’re being threatened . . . .” This violence is so thoroughly present that it’s boring; it hardly ever enters our consciousness. He also points out that the creation and sustenance of systemic violence require very little work. In fact, incredible violence can be done to people with almost no affirmative act at all, as in the horrors of solitary confinement.

The violence Graeber identifies is present in our immigration system: preventing human movement, confining people to cages, and throwing them off “our” land all require force or the threat of it. The mere existence of US immigration law perpetuates violence. Consider the simple example of the harm done to families separated by visa quotas, where the wait can range from years to decades. But statutory schemes aside, there is enough violence to go around in the pure administration of the law, independent of its unjust nature. To stretch Graeber’s analysis further: the violence of immigration bureaucracy extends beyond the direct application of force on migrant bodies to more subtle, hidden forms.

By way of example, consider United States Citizenship and Immigration Services (USCIS), the least outwardly hostile of the immigration agencies that exist under the Department of Homeland Security. USCIS is the agency charged with “adjudicating requests for immigration benefits.” In practice, this means that it reviews the common kinds of applications and processes for migrants transitioning from one form of immigration status to another, e.g., spousal green card applications, DACA applications, and citizenship, to name a few. Despite this rather innocuous, paper-pushing appearance, USCIS has real power over a migrant’s future. It is the bureaucratic equivalent of Border Patrol in its own right, a gatekeeper to lawful status for many migrants both within and without the US. And in its own way, it is just as violent as the officers with guns at the border.

The 18 to 24 months it takes to process a refugee resettlement case leaves already displaced people in life-threatening precarity. Bureaucratic mix-ups and slowness result in prolonged family separations and government detention of children at the border. Even delays in obtaining a work permit present a migrant with the choice of breaking the law by working without authorization or being unable to sustain herself. The same is true of denials. USCIS, under the auspices of the Attorney General, has broad discretion to deny applications, even those that otherwise meet the letter of the law. In many instances where migrants, especially the undocumented, are unable to adjust their status, a denial opens them up to the violence of removal from the country. This is, of course, separate and apart from how delays and denials perpetuate the precarity and violence of living somewhere without status. The absurdity recently reached new heights when USCIS began rejecting applications (including those for asylum) that left any box blank rather than marked “N/A,” even when a question was clearly inapplicable or irrelevant to the benefit sought (e.g., a 2-year-old won’t have any children).

To argue that these are not forms of violence is precisely how the system gets away with it: the real harm inflicted by the system is painted over as purely administrative, lacking either the significance or the intensity to amount to real violence. On the contrary, much like solitary confinement or the neglect of a child, these are expressions of violence in their least obvious form. They impart real harm to real people, and do so on a largely arbitrary basis. To Graeber’s point about the bureaucratization of life, this violence escapes not only our view but the view of almost everyone who interacts with the immigration system. It is both mundane and pervasive, and has thus lost significance, jarring us only every now and again, when we see children in cages, or a father and daughter lying face down in the Rio Grande, drowned in an attempt to cross.

Perhaps what is most troubling about this violence is that it is completely displaced from any one person or even one entity. There is no one to hold directly responsible for it, and in this way, all escape responsibility. Its casualness is both its alibi and its greatest weapon: its ability to ignore the harm it wreaks on others, its lawful ability to do nothing. As Slavoj Žižek points out in his book on violence, the holocausts that stem from capitalism all seem “just to have happened as the result of an ‘objective’ process, which nobody planned and executed and for which there was no ‘Capitalist Manifesto.’” Precisely because these systems appear as “objective” institutions producing similarly “objective” results, they claim the moral and political ground to justify themselves, often as anything but violent.

Despite its perniciousness, there might be hope yet for bureaucracy. Or, better said, for the creation of institutions that are accountable for how they affect people’s lives. Even Graeber, an anarchist, readily acknowledges that certain kinds of bureaucracies have done a great deal of good in the world: the “European social welfare state, with its free education and universal health care, can just be considered . . . one of the greatest achievements of human civilization.” In addition to his scathing analysis, Graeber also offers up a profound critique of what is “realistic.” Drawing on Marx, Graeber notes that “the ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.” If bureaucracy, particularly the kind that lays claim to migrant bodies, is a human construct, it’s time to do it differently.

The Bully's Pulpit: On the Elementary Structure of Domination

By David Graeber

In late February and early March 1991, during the first Gulf War, U.S. forces bombed, shelled, and otherwise set fire to thousands of young Iraqi men who were trying to flee Kuwait. There was a series of such incidents (the "Highway of Death," "Highway 8," the "Battle of Rumaila") in which U.S. air power cut off columns of retreating Iraqis and engaged in what the military refers to as a "turkey shoot," where trapped soldiers are simply slaughtered in their vehicles. Images of charred bodies trying desperately to crawl from their trucks became iconic symbols of the war.

I have never understood why this mass slaughter of Iraqi men isn't considered a war crime. It's clear that, at the time, the U.S. command feared it might be. President George H.W. Bush quickly announced a temporary cessation of hostilities, and the military has deployed enormous efforts since then to minimize the casualty count, obscure the circumstances, defame the victims ("a bunch of rapists, murderers, and thugs," General Norman Schwarzkopf later insisted), and prevent the most graphic images from appearing on U.S. television. It's rumored that there are videos, from cameras mounted on helicopter gunships, of panicked Iraqis; these will never be released.

It makes sense that the elites were worried. These were, after all, mostly young men who'd been drafted and who, when thrown into combat, made precisely the decision one would wish all young men in such a situation would make: saying to hell with this, packing up their things, and going home. For this, they should be burned alive? When ISIS burned a Jordanian pilot alive last winter, it was universally denounced as unspeakably barbaric, which it was, of course. Still, ISIS could at least point out that the pilot had been dropping bombs on them. The retreating Iraqis on the "Highway of Death" and other main drags of American carnage were just kids who didn't want to fight.

But maybe it was this very refusal that's prevented the Iraqi soldiers from garnering more sympathy, not only in elite circles, where you wouldn't expect much, but also in the court of public opinion. On some level, let's face it: these men were cowards. They got what they deserved.

There seems, indeed, a decided lack of sympathy for noncombatant men in war zones. Even reports by international human rights organizations speak of massacres as being directed almost exclusively against women, children, and, perhaps, the elderly. The implication, almost never stated outright, is that adult males are either combatants or have something wrong with them. ("You mean to say there were people out there slaughtering women and children and you weren't out there defending them? What are you? Chicken?") Those who carry out massacres have been known to cynically manipulate this tacit conscription: most famously, the Bosnian Serb commanders who calculated they could avoid charges of genocide if, instead of exterminating the entire population of conquered towns and villages, they merely exterminated all males between ages fifteen and fifty-five.

But there is something more at work in circumscribing our empathy for the fleeing Iraqi massacre victims. U.S. news consumers were bombarded with accusations that they were actually a bunch of criminals who'd been personally raping and pillaging and tossing newborn babies out of incubators (unlike that Jordanian pilot, who'd merely been dropping bombs on cities full of women and children from a safe, or so he thought, altitude). We are all taught that bullies are really cowards, so we easily accept that the reverse must naturally be true as well. For most of us, the primordial experience of bullying and being bullied lurks in the background whenever crimes and atrocities are discussed. It shapes our sensibilities and our capacities for empathy in deep and pernicious ways.


Cowardice Is a Cause, Too

Most people dislike wars and feel the world would be a better place without them. Yet contempt for cowardice seems to move them on a far deeper level. After all, desertion (the tendency of conscripts called up for their first experience of military glory to duck out of the line of march and hide in the nearest forest, gulch, or empty farmhouse and then, when the column has safely passed, figure out a way to return home) is probably the greatest threat to wars of conquest. Napoleon's armies, for instance, lost far more troops to desertion than to combat. Conscript armies often have to deploy a significant percentage of their conscripts behind the lines with orders to shoot any of their fellow conscripts who try to run away. Yet even those who claim to hate war often feel uncomfortable celebrating desertion.

About the only real exception I know of is Germany, which has erected a series of monuments labeled "To the Unknown Deserter." The first and most famous, in Potsdam, is inscribed: "TO A MAN WHO REFUSED TO KILL HIS FELLOW MAN." Yet even here, when I tell friends about this monument, I often encounter a sort of instinctive wince. "I guess what people will ask is: Did they really desert because they didn't want to kill others, or because they didn't want to die themselves?" As if there's something wrong with that.

In militaristic societies like the United States, it is almost axiomatic that our enemies must be cowards, especially if the enemy can be labeled a "terrorist" (i.e., someone accused of wishing to create fear in us, to turn us, of all people, into cowards). It is then necessary to ritually turn matters around and insist that no, it is they who are actually fearful. All attacks on U.S. citizens are by definition "cowardly attacks." The second George Bush was referring to the 9/11 attacks as "cowardly acts" the very next morning. On the face of it, this is odd. After all, there's no lack of bad things one can find to say about Mohammed Atta and his confederates (take your pick, really), but surely "coward" isn't one of them. Blowing up a wedding party using an unmanned drone might be considered an act of cowardice. Personally flying an airplane into a skyscraper takes guts.

Nevertheless, the idea that one can be courageous in a bad cause seems to somehow fall outside the domain of acceptable public discourse, despite the fact that much of what passes for world history consists of endless accounts of courageous people doing awful things.


On Fundamental Flaws

Sooner or later, every project for human freedom will have to comprehend why we accept societies being ranked and ordered by violence and domination to begin with. And it strikes me that our visceral reaction to weakness and cowardice, our strange reluctance to identify with even the most justifiable forms of fear, might provide a clue.

The problem is that debate so far has been dominated by proponents of two equally absurd positions. On the one side, there are those who deny that it's possible to say anything about humans as a species; on the other, there are those who assume that the goal is to explain why it is that some humans seem to take pleasure in pushing other ones around. The latter camp almost invariably ends up spinning stories about baboons and chimps, usually to introduce the proposition that humans (or at least those of us with sufficient quantities of testosterone) inherit from our primate ancestors an inbuilt tendency toward self-aggrandizing aggression that manifests itself in war, which cannot be gotten rid of, but may be diverted into competitive market activity. On the basis of these assumptions, the cowards are those who lack a fundamental biological impulse, and it's hardly surprising that we would hold them in contempt.

There are a lot of problems with this story, but the most obvious is that it simply isn't true. The prospect of going to war does not automatically set off a biological trigger in the human male. Just consider what Andrew Bard Schmookler has referred to as "the parable of the tribes." Five societies share the same river valley. They can all live in peace only if every one of them remains peaceful. The moment one "bad apple" is introduced (say, the young men in one tribe decide that an appropriate way of handling the loss of a loved one is to go bring back some foreigner's head, or that their God has chosen them to be the scourge of unbelievers), the other tribes, if they don't want to be exterminated, have only three options: flee, submit, or reorganize their own societies around effectiveness in war. The logic seems hard to fault. Nevertheless, as anyone familiar with the history of, say, Oceania, Amazonia, or Africa would be aware, a great many societies simply refused to organize themselves on military lines. Again and again, we encounter descriptions of relatively peaceful communities who just accepted that every few years, they'd have to take to the hills as some raiding party of local bad boys arrived to torch their villages, rape, pillage, and carry off trophy parts from hapless stragglers. The vast majority of human males have refused to spend their time training for war, even when it was in their immediate practical interest to do so. To me, this is proof positive that human beings are not a particularly bellicose species. [1]

No one would deny, of course, that humans are flawed creatures. Just about every human language has some analogue of the English "humane," or expressions like "to treat someone like a human being," implying that simply recognizing another creature as a fellow human entails a responsibility to treat them with a certain minimum of kindness, consideration, and respect. It is obvious, however, that nowhere do humans consistently live up to that responsibility. And when we fail, we shrug and say we're "only human." To be human, then, is both to have ideals and to fail to live up to them.

If this is how humans tend to think of themselves, then it's hardly surprising that when we try to understand what makes structures of violent domination possible, we tend to look at the existence of antisocial impulses and ask: Why are some people cruel? Why do they desire to dominate others? These, however, are exactly the wrong questions to ask. Humans have an endless variety of urges. Usually, they're pulling us in any number of different directions at once. Their mere existence implies nothing.

The question we should be asking is not why people are sometimes cruel, or even why a few people are usually cruel (all evidence suggests true sadists are an extremely small proportion of the population overall), but how we have come to create institutions that encourage such behavior and that suggest cruel people are in some ways admirable, or at least as deserving of sympathy as those they push around.

Here I think it's important to look carefully at how institutions organize the reactions of the audience. Usually, when we try to imagine the primordial scene of domination, we see some kind of Hegelian master-slave dialectic in which two parties are vying for recognition from one another, leading to one being permanently trampled underfoot. We should imagine instead a three-way relation of aggressor, victim, and witness, one in which both contending parties are appealing for recognition (validation, sympathy, etc.) from someone else. The Hegelian battle for supremacy, after all, is just an abstraction. A just-so story. Few of us have witnessed two grown men duel to the death in order to get the other to recognize him as truly human. The three-way scenario, in which one party pummels another while both appeal to those around them to recognize their humanity, we've all witnessed and participated in, taking one role or the other, a thousand times since grade school.


Elementary (School) Structures of Domination

I am speaking, of course, about schoolyard bullying. Bullying, I propose, represents a kind of elementary structure of human domination. If we want to understand how everything goes wrong, this is where we should begin.

In this case too, provisos must be introduced. It would be very easy to slip back into crude evolutionary arguments. There is a tradition of thought (the Lord of the Flies tradition, we might call it) that interprets schoolyard bullies as a modern incarnation of the ancestral "killer ape," the primordial alpha male who instantly restores the law of the jungle once no longer restrained by rational adult male authority. But this is clearly false. In fact, books like Lord of the Flies are better read as meditations on the kind of calculated techniques of terror and intimidation that British public schools employed to shape upper-class children into officials capable of running an empire. These techniques did not emerge in the absence of authority; they were techniques designed to create a certain sort of cold-blooded, calculating adult male authority to begin with.

Today, most schools are not like the Eton and Harrow of William Golding's day, but even at those that boast of their elaborate anti-bullying programs, schoolyard bullying happens in a way that's in no sense at odds with or in spite of the school's institutional authority. Bullying is more like a refraction of its authority. To begin with an obvious point: children in school can't leave. Normally, a child's first instinct upon being tormented or humiliated by someone much larger is to go someplace else. Schoolchildren, however, don't have that option. If they try persistently to flee to safety, the authorities will bring them back. This is one reason, I suspect, for the stereotype of the bully as teacher's pet or hall monitor: even when it's not true, it draws on the tacit knowledge that the bully does depend on the authority of the institution in at least that one way. The school is, effectively, holding the victims in place while their tormentors hit them. This dependency on authority is also why the most extreme and elaborate forms of bullying take place in prisons, where dominant inmates and prison guards fall into alliances.

Even more, bullies are usually aware that the system is likely to punish any victim who strikes back more harshly. Just as a woman, confronted by an abusive man who may well be twice her size, cannot afford to engage in a "fair fight," but must seize the opportune moment to inflict as much damage as possible on the man who's been abusing her, since she cannot leave him in a position to retaliate, so too must the schoolyard bullying victim respond with disproportionate force, not to disable the opponent, in this case, but to deliver a blow so decisive that it makes the antagonist hesitate to engage again.

I learned this lesson firsthand. I was scrawny in grade school, younger than my peers (I'd skipped a grade) and thus a prime target for some of the bigger kids, who seemed to have developed a quasi-scientific technique of jabbing runts like me sharp, hard, and quick enough to avoid being accused of "fighting." Hardly a day went by that I was not attacked. Finally, I decided enough was enough, found my moment, and sent one particularly noxious galoot sprawling across the corridor with a well-placed blow to the head. I think I might have cracked his lip. In a way, it worked exactly as intended: for a month or two, bullies largely stayed away. But the immediate result was that we were both taken to the office for fighting, and the fact that he had struck first was determined to be irrelevant. I was found to be the guilty party and expelled from the school's advanced math and science club. (Since he was a C student, there was nothing, really, for him to be expelled from.)

"It doesn't matter who started it" are probably six of most insidious words in the English language. Of course it matters.


Crowdsourced Cruelty

Very little of this focus on the role of institutional authority is reflected in the psychological literature on bullying, which, being largely written for school authorities, assumes that their role is entirely benign. Still, recent research (of which there has been an outpouring since Columbine) has yielded, I think, a number of surprising revelations about the elementary forms of domination. Let's go deeper.

The first thing this research reveals is that the overwhelming majority of bullying incidents take place in front of an audience. Lonely, private persecution is relatively rare. Much of bullying is about humiliation, and the effects cannot really be produced without someone to witness them. Sometimes, onlookers actively abet the bully, laughing, goading, or joining in. More often, the audience is passively acquiescent. Only rarely does anyone step in to defend a classmate being threatened, mocked, or physically attacked.

When researchers question children on why they do not intervene, a minority say they felt the victim got what he or she deserved, but the majority say they didn't like what happened, and certainly didn't much like the bully, but decided that getting involved might mean ending up on the receiving end of the same treatment, and that would only make things worse. Interestingly, this is not true. Studies also show that, in general, if one or two onlookers object, bullies back off. Yet somehow most onlookers are convinced the opposite will happen. Why?

For one thing, because nearly every genre of popular fiction they are likely to be exposed to tells them it will. Comic book superheroes routinely step in to say, "Hey, stop beating on that kid," and invariably the culprit does indeed turn his wrath on them, resulting in all sorts of mayhem. (If there is a covert message in such fiction, it is surely along the lines of: "You had better not get involved in such matters unless you are capable of taking on some monster from another dimension who can shoot lightning from its eyes.") The "hero," as deployed in the U.S. media, is largely an alibi for passivity. This first occurred to me when watching a small-town TV newscaster praising some teenager who'd jumped into a river to save a drowning child. "When I asked him why he did it," the newscaster remarked, "he said what true heroes always say: 'I just did what anyone would do under the circumstances.'" The audience is supposed to understand that, of course, this isn't true. Anyone would not do that. And that's okay. Heroes are extraordinary. It's perfectly acceptable under the same circumstances for you to just stand there and wait for a professional rescue team.

It's also possible that audiences of grade schoolers react passively to bullying because they have caught on to how adult authority operates and mistakenly assume the same logic applies to interactions with their peers. If it is, say, a police officer who is pushing around some hapless adult, then yes, it is absolutely true that intervening is likely to land you in serious trouble, quite possibly at the wrong end of a club. And we all know what happens to "whistleblowers." (Remember Secretary of State John Kerry calling on Edward Snowden to "man up" and submit himself to a lifetime of sadistic bullying at the hands of the U.S. criminal justice system? What is an innocent child supposed to make of this?) The fates of the Mannings or Snowdens of the world are high-profile advertisements for a cardinal principle of American culture: while abusing authority may be bad, openly pointing out that someone is abusing authority is much worse, and merits the severest punishment.

A second surprising finding from recent research: bullies do not, in fact, suffer from low self-esteem. Psychologists had long assumed that mean kids were taking out their insecurities on others. No. It turns out that most bullies act like self-satisfied little pricks not because they are tortured by self-doubt, but because they actually are self-satisfied little pricks. Indeed, such is their self-assurance that they create a moral universe in which their swagger and violence become the standard by which all others are to be judged; weakness, clumsiness, absentmindedness, or self-righteous whining are not just sins, but provocations that would be wrong to leave unaddressed.

Here, too, I can offer personal testimony. I keenly remember a conversation with a jock I knew in high school. He was a lunk, but a good-natured one. I think we'd even gotten stoned together once or twice. One day, after rehearsing some costume drama, I thought it would be fun to walk into the dorm in Renaissance garb. As soon as he saw me, he pounced as if about to pulverize me. I was so indignant I forgot to be terrified. "Matt! What the hell are you doing? Why would you want to attack me?" Matt seemed so taken aback that he forgot to continue menacing me. "But . . . you came into the dorm wearing tights!" he protested. "I mean, what did you expect?" Was Matt enacting deep-seated insecurities about his own sexuality? I don't know. Probably so. But the real question is, why do we assume his troubled mind is so important? What really matters is that he genuinely felt he was defending a social code.

In this instance, the adolescent bully was deploying violence to enforce a code of homophobic masculinity that underpins adult authority as well. But with smaller children, this is often not the case. Here we come to a third surprising finding of the psychological literature, maybe the most telling of all. At first, it's not actually the fat girl, or the boy with glasses, who is most likely to be targeted. That comes later, as bullies (ever cognizant of power relations) learn to choose their victims according to adult standards. At first, the principal criterion is how the victim reacts. The ideal victim is not absolutely passive. No, the ideal victim is one who fights back in some way but does so ineffectively, by flailing about, say, or screaming or crying, threatening to tell their mother, pretending they're going to fight and then trying to run away. Doing so is precisely what makes it possible to create a moral drama in which the audience can tell itself the bully must be, in some sense, in the right.

This triangular dynamic among bully, victim, and audience is what I mean by the deep structure of bullying. It deserves to be analyzed in the textbooks. Actually, it deserves to be set in giant neon letters everywhere: Bullying creates a moral drama in which the manner of the victim's reaction to an act of aggression can be used as retrospective justification for the original act of aggression itself.

Not only does this drama appear at the very origins of bullying in early childhood; it is precisely the aspect that endures in adult life. I call it the "you two cut it out" fallacy. Anyone who frequents social media forums will recognize the pattern. Aggressor attacks. Target tries to rise above and do nothing. No one intervenes. Aggressor ramps up attack. Target tries to rise above and do nothing. No one intervenes. Aggressor further ramps up attack.

This can happen a dozen, fifty times, until finally, the target answers back. Then, and only then, a dozen voices immediately sound, crying "Fight! Fight! Look at those two idiots going at it!" or "Can't you two just calm down and learn to see the other's point of view?" The clever bully knows that this will happen-and that he will forfeit no points for being the aggressor. He also knows that if he tempers his aggression to just the right pitch, the victim's response can itself be represented as the problem.

Nob: You're a decent chap, Jeeves, but I must say, you're a bit of an imbecile.

Jeeves: A bit of a . . . what?? What the hell do you mean by that?

Nob: See what I mean? Calm down! I said you were a decent chap. And such language! Don't you realize there are ladies present?

And what is true of social class is also true of any other form of structural inequality: hence epithets such as "shrill women," "angry black men," and an endless variety of similar terms of dismissive contempt. But the essential logic of bullying is prior to such inequalities. It is the ur-stuff of which they are made.


Stop Hitting Yourself

And this, I propose, is the critical human flaw. It's not that as a species we're particularly aggressive. It's that we tend to respond to aggression very poorly. Our first instinct when we observe unprovoked aggression is either to pretend it isn't happening or, if that becomes impossible, to equate attacker and victim, placing both under a kind of contagion, which, it is hoped, can be prevented from spreading to everybody else. (Hence, the psychologists' finding that bullies and victims tend to be about equally disliked.) The feeling of guilt caused by the suspicion that this is a fundamentally cowardly way to behave-since it is a fundamentally cowardly way to behave-opens up a complex play of projections, in which the bully is seen simultaneously as an unconquerable super-villain and a pitiable, insecure blowhard, while the victim becomes both an aggressor (a violator of whatever social conventions the bully has invoked or invented) and a pathetic coward unwilling to defend himself.

Obviously, I am offering only the most minimal sketch of complex psychodynamics. But even so, these insights may help us understand why we find it so difficult to extend our sympathies to, among others, fleeing Iraqi conscripts gunned down in "turkey shoots" by U.S. warriors. We apply the same logic we did when passively watching some childhood bully terrorizing his flailing victim: we equate aggressors and victims, insist that everyone is equally guilty (notice how, whenever one hears a report of an atrocity, some will immediately start insisting that the victims must have committed atrocities too), and just hope that by doing so, the contagion will not spread to us.

This is difficult stuff. I don't claim to understand it completely. But if we are ever going to move toward a genuinely free society, then we're going to have to recognize how the triangular and mutually constitutive relationship of bully, victim, and audience really works, and then develop ways to combat it. Remember, the situation isn't hopeless. If it were not possible to create structures (habits, sensibilities, forms of common wisdom) that do sometimes prevent the dynamic from clicking in, then egalitarian societies of any sort would never have been possible. Remember, too, how little courage is usually required to thwart bullies who are not backed up by any sort of institutional power. Most of all, remember that when the bullies really are backed up by such power, the heroes may be those who simply run away.



Notes

[1] Still, before we let adult males entirely off the hook, I should observe that the argument for military efficiency cuts two ways: even those societies whose men refuse to organize themselves effectively for war also do, in the overwhelming majority of cases, insist that women should not fight at all. This is hardly very efficient. Even if one were to concede that men are, generally speaking, better at fighting (and this is by no means clear; it depends on the type of fighting), and one were to simply choose the most able-bodied half of any given population, then some of them would be female. Anyway, in a truly desperate situation it can be suicidal not to employ every hand you've got. Nonetheless, again and again we find men, even those relatively nonbelligerent ones, deciding they would rather die than break the code saying women should never be allowed to handle weapons. No wonder we find it so difficult to sympathize with male atrocity victims: they are, to the degree that they segregate women from combat, complicit in the logic of male violence that destroyed them. But if we are trying to identify that key flaw or set of flaws in human nature that allows for that logic of male violence to exist to begin with, it leaves us with a confusing picture. We do not, perhaps, have some sort of inbuilt proclivity for violent domination. But we do have a tendency to treat those forms of violent domination that do exist, starting with that of men over women, as moral imperatives unto themselves.



This article was originally published at The Baffler