Ars Philosopha — October 9, 2014, 8:00 am

Are Humans Good or Evil?

A brief philosophical debate.


Philosophers Clancy Martin and Alan Strudler debate whether humans are, as Martin argues, inherently good. “Clancy Martin sees people as ‘mostly good,’” Strudler counters. “His most persuasive reason seems quantitative: people much more often act rightly than wrongly. … The idea that people are good because they do mostly good things makes sense only on a desperately low standard of goodness.”

Clancy Martin: The Confucian philosopher Mencius (382-303 BCE) tells the story of how an ordinary person feels if he sees a child fall into a well. Any person, Mencius argues, will feel alarm and distress — not because he hoped for anything from the parents, nor because he feared the anger of anyone because he failed to save the child, nor again because he wanted to enhance his own reputation by this act of modest heroism. It’s simply that, Mencius says, “All men have a mind which cannot bear to see the sufferings of others.”

Mencius’s claim is too strong. Whether we think of jihadists cutting off the heads of innocent journalists or soldiers waterboarding helpless prisoners, everywhere we look we see examples of humans not only bearing the sufferings of others, but causing them, even taking pleasure in them. Surely we are both good and evil: it’s hard to imagine an argument or an experiment that would prove that we are wholly one or the other. But that we are both good and evil doesn’t mean we are an equal mix of the two. What Mencius intends is that we are mostly good — that good is our normal state of being and evil is an exceptional one, in much the way health is our normal state of being and sickness the exception.

What is our goodness? It seems to be nothing more than the natural tendency towards sympathy, and it looks like it’s not restricted to human beings. A study found that rhesus monkeys would decline to pull a chain that delivered a heaping meal once they figured out that this action simultaneously administered a shock to another rhesus monkey. “These monkeys were literally starving themselves to prevent the shock to the conspecific,” noted one of the researchers. There is also overwhelming evidence that primates experience sympathy for animals outside their own species: Kuni, a bonobo female at Twycross Zoo in England, cared for a wounded starling that fell into her enclosure until it could be removed by the zookeeper.

To be good is fundamentally to have other people’s interests in mind, and not—as Mencius is concerned to point out (and after him the philosopher Immanuel Kant)—just because of the good they can do you in return. To be good is to have some genuinely selfless motivations. Of course, circumstances often encourage us to act in selfish and cruel ways: the world is a competitive, rough-and-tumble place, with more scarcity than supply. Yet most of us, when allowed, are naturally inclined toward sympathy and fellow-feeling. As the philosopher R.J. Hankinson pointed out to me when we were discussing the problem: “It’s not just that we do act with sympathy, we are happier when we act that way — and we don’t do it just because it makes us happy.” There’s an interesting sort of self-defeating quality about trying to be altruistic for selfish reasons. But what is rare — and looks pathological — is to act purely selfishly.

The very oddness, the counter-intuitiveness of the good, suggests a primal source of its existence in us, which in turn explains our perennial insistence upon it. Something in us rebels against the idea of abandoning goodness. Indeed there are many, whether theists or no, who would be deeply offended by the mere suggestion. Here again we see that our goodness wells up from a deep, natural spring.

To make a claim that will strike some as naïve, and others as a dangerous self-deception: we must believe man is good. We must believe so even when evidence to the contrary threatens to overwhelm us — perhaps especially when evidence to the contrary begins to overwhelm us. Bruce Bubacz, another philosopher friend of mine, recently argued with me (this is the kind of thing that happens if you befriend too many philosophers) that every man can be either good or evil: it all depends, to seize on a photographic metaphor, “on the developer and the fixer.” He reminded me of the Cherokee myth that both a good wolf and a bad wolf live in the human heart, always at battle, and the winner is “the one you feed.”

Bubacz’s way of looking at it does make sense of cases like the complicity of ordinary Germans in the Holocaust, or the escalations of evil in very recent genocides in places such as Bosnia and Sudan. But these arguments ultimately strike me as both dangerous and too easy. The reason we morally accuse evil when we see it — and are right to do so, are required to do so — is that we know Lincoln’s “better angels of our nature” are there to guide us. Rainer Maria Rilke advised his young poet, when seeking moral guidance, always to do what is most difficult. Rilke was not suggesting man is naturally evil and must overcome his uglier impulses through the struggle for goodness, but that our goodness is there to be found, with difficulty, in a world that may well encourage evil.

We should remember our heroes. Even a cynic is obliged to admit that our greatest heroes are moral exemplars. The most successful billionaires are not admired in the same way (except when, like Carnegie, Gates, and Buffett, they give back). Our most brilliant scientists and artists don’t enjoy quite the same level of admiration as those figures like Buddha, Socrates, Jesus Christ, Gandhi, Mother Teresa, and Martin Luther King Jr., who represent the finest degree of moral superiority.

But what convinces me, more than anything else, of our innate goodness is the existence of moral progress. In a world of moral relativists, the idea of moral improvement is a non-starter. Changes in moral codes are merely changes in social convention, goes the argument, and the study of morality is really a subgenre of anthropology. (One of my own favorite philosophers, Friedrich Nietzsche, held exactly this view.) But even as fine a mind as Aristotle’s once believed that slavery was necessary for a good society, a practice we now universally condemn. The fact that human rights are spreading in the world, slowly but surely, also strikes me as an undeniable improvement.

Gandhi’s oft-repeated words on the subject have that hit-your-head-with-a-stick effect that drives out my occasional skeptical melancholy about the human condition: “When I despair, I remember that all through history the ways of truth and love have always won. There have been tyrants, and murderers, and for a time they can seem invincible, but in the end they always fall.” Yes, they keep rising up again, too. But when I look at the long, often vicious fight that is human history, it seems to me that the good guys are winning—and that, perhaps more importantly, we really want them to win.

Alan Strudler: Human horror should surprise nobody. Jeffrey Dahmer kills and rapes teenagers, then eats their body parts; professional hit men coolly assassinate strangers in exchange for a little money; leaders from Hitler to Pol Pot to Assad orchestrate colossal rituals of cruelty. The list goes on. Each horror involves its own unique assault on human dignity, but they share something more striking than their differences: an appetite for suffering unique to us. Maybe there is something wrong with our species; maybe there is something wrong with each of us.

It is natural to dismiss Dahmer and the others as differing fundamentally from ordinary people. They are rapists, torturers, sadists, and we are not. It is natural to conclude that even though we have our flaws, their evil cannot be ours.

But grave moral problems inhere in us all. The wrongs for which we are generally responsible are not as sensational as Dahmer’s, but they are profound. Establishing this point requires reflection on arguments from the Princeton philosopher Peter Singer and a bit of social psychology.

Imagine that I take a stroll and see a child drowning in a shallow pond (a variation, of course, on Mencius’s child falling into the well). Nobody else is around. I can easily save the child, but don’t. I am responsible for his death. I have committed a grave wrong, perhaps the moral equivalent of murder. When I can save a life without sacrificing anything of moral significance, I must do so, Singer argues.

Across the world people are dying whom you and I can save by contributing a few dollars to Oxfam, and whom we don’t save. (If you are skeptical about this argument because you doubt Oxfam’s efficacy, ask yourself whether people would behave differently if they knew that their contribution to Oxfam would save lives. It would hardly make a difference.) Perhaps we are in some ways like murderers, without their cruelty or cold-bloodedness. I have been discussing Singer’s argument with students in business and the humanities for twenty years. Most of them recognize its force, acknowledge responsibility for death around the world, shrug their shoulders, and move on. Death at a distance leaves us unfazed.

In the right circumstances, we are capable of doing the right thing. Yet we are fickle about goodness, as shown by one famous psychology experiment. Subjects coming out of a phone booth encounter a woman who has just dropped some papers, which are scattered across the ground. Some subjects help pick up the papers; others do not. The difference between the two groups is largely fixed by whether a subject found a coin that the experimenter had planted in the phone booth. Those who find the coin are happy and tend to help; those who don’t find a coin tend not to help. Other experiments get similar results even when the stakes are much higher. In some situations, most people will knowingly and voluntarily expose others to death for no good reason. People cannot be trusted.

The Georgetown philosopher Judith Lichtenberg surveys these experiments, along with Peter Singer’s arguments, and finds hope. Maybe investigations of this sort will reveal enough about our psychology so that we can arrange the world in ways to make people behave better and limit human misery. Even if she is right, her hope should trouble us. If we need a crutch — a coin in a phone booth — to make us behave well, our commitment to morality seems shallow.

If further proof is needed of the unsteady relationship between humans and morality, think about Bernard Madoff. A beloved family man, a friend and trusted advisor to leaders across the political and business world, this Ponzi schemer vaporized billions of dollars in personal life savings and institutional endowments. How can a man so apparently decent do such wrong? One answer is that the decency was a well-crafted illusion: Madoff was a monster, a cold-blooded financial predator from the start, feigning friendships in a ploy to seduce clients so that he could steal their money. A different answer is that Madoff would have led a morally unexceptional life if only he hadn’t found himself in situations that brought out the worst in him.

In his remarks above, Clancy Martin sees people as “mostly good.” His most persuasive reason seems quantitative: people much more often act rightly than wrongly. Does Martin correctly tabulate our right and wrong acts? I don’t know, and neither does anybody else. Much about humans can be measured, including our body temperatures, white blood cell counts, even the extent to which we hold one peculiar belief or another — but no technology exists for measuring the comparative frequency with which we engage in wrongful conduct, and there is no reason to think that our unaided skills of observation might yield reliable judgments on the issue. It hardly matters. Even if we concede, for the sake of argument, that good acts narrowly nose out the bad ones, it does not make Martin’s case. The idea that people are good because they do mostly good things makes sense only on a desperately low standard of goodness.

Suppose that you have a morally wonderful year, which involves endless acts of showing love to your family and kindness to strangers; as a member of the volunteer fire department, you even save a few lives. Then, one day, you see a chance to get rich by killing your wealthy uncle and accelerating your inheritance, and you seize the opportunity. Your one bad act easily outweighs a year’s good acts. Your claim to decency fails. This shows that there is a basic asymmetry between being good and being bad. One must do much good to be good, but doing just a little bad trumps that good.

Indeed, talk about good and bad seems strangely untethered to empirical facts. Perhaps the point of such talk, Martin tells us, is pragmatic rather than descriptive; it aims to motivate and exhort us, not to deliver any truths. If we think of ourselves as good, it will improve our prospects for becoming so. But again, recent history has not been kind to the pragmatic argument.

Remember Liar’s Poker, Michael Lewis’s recounting of the fall of Salomon Brothers. At the center of the tale is Salomon CEO John Gutfreund, who passively watches his firm get destroyed by fraud. One of Gutfreund’s most salient traits was his perception of himself as morally superior, a man who would never tolerate wrong. Because he thought he was too good to do wrong, Gutfreund refused to recognize the depth of the wrong in which he was involved, and disaster followed. When not rooted in facts, moral confidence dulls our senses. We should try to do our best, but that requires taking our evil seriously.
