Ars Philosopha — October 9, 2014, 8:00 am

Are Humans Good or Evil?

A brief philosophical debate.

In this story: philosophers, the ethics of rhesus monkey testing, Friedrich Nietzsche, selfish altruists, animal concerns, sadists, Immanuel Kant, and Ponzi schemers.

Philosophers Clancy Martin and Alan Strudler debate whether humans are, as Martin argues, inherently good. “Clancy Martin sees people as ‘mostly good,’” Strudler counters. “His most persuasive reason seems quantitative: people much more often act rightly than wrongly… The idea that people are good because they do mostly good things makes sense only on a desperately low standard of goodness.”

Clancy Martin: The Confucian philosopher Mencius (382-303 BCE) tells the story of how an ordinary person feels if he sees a child fall into a well. Any person, Mencius argues, will feel alarm and distress — not because he hoped for anything from the parents, nor because he feared the anger of anyone because he failed to save the child, nor again because he wanted to enhance his own reputation by this act of modest heroism. It’s simply that, Mencius says, “All men have a mind which cannot bear to see the sufferings of others.”

Mencius’s claim is too strong. Whether we think of jihadists cutting off the heads of innocent journalists or soldiers waterboarding helpless prisoners, everywhere we look we see examples of humans not only bearing the sufferings of others, but causing them, even taking pleasure in them. Surely we are both good and evil: it’s hard to imagine an argument or an experiment that would prove that we are wholly one or the other. But that we are both good and evil doesn’t mean we are an equal mix of the two. What Mencius intends is that we are mostly good — that good is our normal state of being and evil is an exceptional one, in much the way health is our normal state of being and sickness the exception.

What is our goodness? It seems to be nothing more than the natural tendency towards sympathy, and it looks like it’s not restricted to human beings. A study found that rhesus monkeys would decline to pull a chain that delivered a heaping meal once they figured out that this action simultaneously administered a shock to another rhesus monkey. “These monkeys were literally starving themselves to prevent the shock to the conspecific,” noted one of the researchers. There is also overwhelming evidence that primates experience sympathy for animals outside their own species: Kuni, a bonobo female at Twycross Zoo in England, cared for a wounded starling that fell into her enclosure until it could be removed by the zookeeper.

To be good is fundamentally to have other people’s interests in mind, and not—as Mencius is concerned to point out (and after him the philosopher Immanuel Kant)—just because of the good they can do you in return. To be good is to have some genuinely selfless motivations. Of course, circumstances often encourage us to act in selfish and cruel ways: the world is a competitive, rough-and-tumble place, with more demand than supply. Yet most of us, when allowed, are naturally inclined toward sympathy and fellow-feeling. As the philosopher R.J. Hankinson pointed out to me when we were discussing the problem: “It’s not just that we do act with sympathy, we are happier when we act that way — and we don’t do it just because it makes us happy.” There’s an interesting sort of self-defeating quality about trying to be altruistic for selfish reasons. But what is rare — and looks pathological — is to act purely selfishly.

The very oddness, the counter-intuitiveness of the good, suggests a primal source of its existence in us, which in turn explains our perennial insistence upon it. Something in us rebels against the idea of abandoning goodness. Indeed there are many, whether theists or no, who would be deeply offended by the mere suggestion. Here again we see that our goodness wells up from a deep, natural spring.

To make a claim that will strike some as naïve, and others as a dangerous self-deception: we must believe man is good. We must believe so even when evidence to the contrary threatens to overwhelm us — perhaps especially when evidence to the contrary begins to overwhelm us. Bruce Bubacz, another philosopher friend of mine, recently argued with me (this is the kind of thing that happens if you befriend too many philosophers) that every man can be either good or evil: it all depends, to seize on a photographic metaphor, “on the developer and the fixer.” He reminded me of the Cherokee myth that both a good wolf and a bad wolf live in the human heart, always at battle, and the winner is “the one you feed.”

Bubacz’s way of looking at it does make sense of cases like the complicity of ordinary Germans in the Holocaust, or the escalations of evil in very recent genocides in places such as Bosnia and Sudan. But these arguments ultimately strike me as both dangerous and too easy. The reason we condemn evil when we see it — and are right to do so, are required to do so — is that we know Lincoln’s “better angels of our nature” are there to guide us. Rainer Maria Rilke advised his young poet, when seeking moral guidance, always to do what is most difficult. Rilke was not suggesting man is naturally evil and must overcome his uglier impulses through the struggle for goodness, but that our goodness is there to be found, with difficulty, in a world that may well encourage evil.

We should remember our heroes. Even a cynic is obliged to admit that our greatest heroes are moral exemplars. The most successful billionaires are not admired in the same way (except when, like Carnegie, Gates, and Buffett, they give back). Our most brilliant scientists and artists don’t enjoy quite the same level of admiration as figures like Buddha, Socrates, Jesus Christ, Gandhi, Mother Teresa, and Martin Luther King Jr., who represent the highest degree of moral excellence.

But what convinces me, more than anything else, of our innate goodness is the existence of moral progress. In a world of moral relativists, the idea of moral improvement is a non-starter. Changes in moral codes are merely changes in social convention, goes the argument, and the study of morality is really a subgenre of anthropology. (One of my own favorite philosophers, Friedrich Nietzsche, held exactly this view.) But even as fine a mind as Aristotle’s once believed that slavery, a practice we now universally condemn, was necessary for a good society. The fact that human rights are spreading in the world, slowly but steadily, also strikes me as an undeniable improvement.

Gandhi’s oft-repeated words on the subject have that hit-your-head-with-a-stick effect that drives out my occasional skeptical melancholy about the human condition: “When I despair, I remember that all through history the ways of truth and love have always won. There have been tyrants, and murderers, and for a time they can seem invincible, but in the end they always fall.” Yes, they keep rising up again, too. But when I look at the long, often vicious fight that is human history, it seems to me that the good guys are winning—and that, perhaps more importantly, we really want them to win.

Alan Strudler: Human horror should surprise nobody. Jeffrey Dahmer kills and rapes teenagers, then eats their body parts; professional hit men coolly assassinate strangers in exchange for a little money; leaders from Hitler to Pol Pot to Assad orchestrate colossal rituals of cruelty. The list goes on. Each horror involves its own unique assault on human dignity, but they share something more striking than their differences: an appetite for inflicting suffering that is unique to us. Maybe there is something wrong with our species; maybe there is something wrong with each of us.

It is natural to dismiss Dahmer and the others as differing fundamentally from ordinary people. They are rapists, torturers, sadists, and we are not. It is natural to conclude that even though we have our flaws, their evil cannot be ours.

But grave moral problems inhere in us all. The wrongs for which we are generally responsible are not as sensational as Dahmer’s, but they are profound. Establishing this point requires reflection on arguments from the Princeton philosopher Peter Singer and a bit of social psychology.

Imagine that I take a stroll and see a child drowning in a shallow pond (a variation, of course, on Mencius’s child falling into the well). Nobody else is around. I can easily save the child, but don’t. I am responsible for his death. I have committed a grave wrong, perhaps the moral equivalent of murder. When I can save a life without sacrificing anything of moral significance, I must do so, Singer argues.

Across the world people are dying whom you and I can save by contributing a few dollars to Oxfam, and whom we don’t save. (If you are skeptical about this argument because you doubt Oxfam’s efficacy, ask yourself whether people would behave differently if they knew that their contribution to Oxfam would save lives. It would hardly make a difference.) Perhaps we are in some ways like murderers, without their cruelty or cold-bloodedness. I have been discussing Singer’s argument with students in business and the humanities for twenty years. Most of them recognize its force, acknowledge responsibility for death around the world, shrug their shoulders, and move on. Death at a distance leaves us unfazed.

In the right circumstances, we are capable of doing the right thing. Yet we are fickle about goodness, as shown by one famous psychology experiment. Subjects coming out of a phone booth encounter a woman who has just dropped some papers, which are scattered across the ground. Some subjects help pick up the papers; others do not. The difference between the two groups is largely determined by whether a subject found a coin that the experimenter had planted in the phone booth. Those who find the coin are happy and tend to help; those who don’t tend not to help. Other experiments get similar results even when the stakes are much higher. In some situations, most people will knowingly and voluntarily expose others to death for no good reason. People cannot be trusted.

The Georgetown philosopher Judith Lichtenberg surveys these experiments, along with Peter Singer’s arguments, and finds hope. Maybe investigations of this sort will reveal enough about our psychology that we can arrange the world in ways that make people behave better and limit human misery. Even if she is right, her hope should trouble us. If we need a crutch — a coin in a phone booth — to make us behave well, our commitment to morality seems shallow.

If further proof is needed of the unsteady relationship between humans and morality, think about Bernard Madoff. A beloved family man, a friend and trusted advisor to leaders across the political and business world, this Ponzi schemer vaporized billions of dollars in personal life savings and institutional endowments. How can a man so apparently decent do such wrong? One answer is that the decency was a well-crafted illusion: Madoff was a monster, a cold-blooded financial predator from the start, feigning friendships in a ploy to seduce clients so that he could steal their money. A different answer is that Madoff would have led a morally unexceptional life if only he hadn’t found himself in situations that brought out the worst in him.

In his remarks above, Clancy Martin sees people as “mostly good.” His most persuasive reason seems quantitative: people much more often act rightly than wrongly. Does Martin correctly tabulate our right and wrong acts? I don’t know, and neither does anybody else. Much about humans can be measured, including our body temperatures, white blood cell counts, even the extent to which we hold one peculiar belief or another — but no technology exists for measuring the comparative frequency with which we engage in wrongful conduct, and there is no reason to think that our unaided skills of observation might yield reliable judgments on the issue. It hardly matters. Even if we concede, for the sake of argument, that good acts narrowly nose out the bad ones, it does not make Martin’s case. The idea that people are good because they do mostly good things makes sense only on a desperately low standard of goodness.

Suppose that you have a morally wonderful year, which involves endless acts of showing love to your family and kindness to strangers; as a member of the volunteer fire department, you even save a few lives. Then, one day, you see a chance to get rich by killing your wealthy uncle and accelerating your inheritance, and you seize the opportunity. Your one bad act easily outweighs a year’s good acts. Your claim to decency fails. This shows that there is a basic asymmetry between being good and being bad. One must do much good to be good, but doing just a little bad trumps that good.

Indeed, talk about good and bad seems strangely untethered to empirical facts. Perhaps the point of such talk, Martin tells us, is pragmatic rather than descriptive; it aims to motivate and exhort us, not to deliver any truths. If we think of ourselves as good, it will improve our prospects for becoming so. But again, recent history has not been kind to the pragmatic argument.

Remember Liar’s Poker, Michael Lewis’s recounting of the fall of Salomon Brothers. At the center of the tale is Salomon CEO John Gutfreund, who passively watches his firm get destroyed by fraud. One of Gutfreund’s most salient traits was his perception of himself as morally superior, a man who would never tolerate wrong. Because he thought he was too good to do wrong, Gutfreund refused to recognize the depth of the wrong in which he was involved, and disaster followed. When not rooted in facts, moral confidence dulls our senses. We should try to do our best, but that requires taking our evil seriously.
