“Progress is impossible without change,” George Bernard Shaw wrote in 1944, “and those who cannot change their minds cannot change anything.” But progress through persuasion has never seemed harder to achieve. Political segregation has made many Americans inaccessible, even unimaginable, to those on the other side of the partisan divide. On the rare occasions when we do come face-to-face, it is not clear what we could say to change each other’s minds or reach a worthwhile compromise. Psychological research has shown that we often fail to process facts that conflict with our preexisting worldviews. The stakes are simply too high: our self-worth and identity are entangled with our beliefs — and with those who share them. The weakness of logic as a tool of persuasion, combined with the urgency of the political moment, can be paralyzing.
Yet we know that people do change their minds. We are constantly molded by our environment and our culture, by the events of the world, by the gossip we hear and the books we read. In the essays that follow, seven writers explore the ways that persuasion operates in our lives, from the intimate to the far-reaching. Some consider the ethics and mechanics of persuasion itself — in religion, politics, and foreign policy — and others turn their attention to the channels through which it acts, such as music, protest, and technology. How, they ask, can we persuade others to join our cause or see things the way we do? And when it comes to our own openness to change, how do we decide when to compromise and when to resist?
Contributors
Hanif Abdurraqib lives in Columbus, Ohio. His first collection of essays, They Can’t Kill Us Until They Kill Us, was published last year by Two Dollar Radio.
David Bromwich is the Sterling Professor of English at Yale University. His most recent essay for Harper’s Magazine, “What Went Wrong,” appeared in the June 2015 issue.
Kelly Clancy is a neuroscientist and a postdoctoral fellow at the University of Basel, Switzerland.
Garth Greenwell’s debut novel, What Belongs to You, was published in 2016 by Picador. He lives in Iowa City, Iowa.
Laila Lalami is a professor of creative writing at the University of California, Riverside. Her most recent novel is The Moor’s Account (Pantheon).
T. M. Luhrmann is the Watkins University Professor of Anthropology at Stanford. Her article “Blinded by the Right?” appeared in the April 2013 issue of Harper’s Magazine.
Mychal Denzel Smith’s memoir, Invisible Man, Got the Whole World Watching, was published in 2016 by Nation Books.
Worlds Apart
By T. M. Luhrmann
In March 1997, the bodies of thirty-nine people were discovered in a mansion outside San Diego. They were found lying in bunk beds, wearing identical black shirts and sweatpants. Their faces were covered with squares of purple cloth, and each of their pockets held exactly five dollars and seventy-five cents.
The police determined that the deceased were members of Heaven’s Gate, a local cult, and that they had intentionally overdosed on barbiturates. Marshall Applewhite, the group’s leader, had believed that there was a UFO trailing in the wake of Comet Hale-Bopp, which was visible in the sky over California that year. He and his followers took the pills, mixing them with applesauce and washing them down with vodka, in order to beam up to the spacecraft and enter the “evolutionary level above human.”
In the aftermath of the mass suicide, one question was asked again and again: How could so many people have believed something so obviously wrong?
I am an anthropologist of religion. I did my first stint of fieldwork with middle-class Londoners who identified as witches, druids, and initiates of the Western Mysteries. My next project was in Mumbai, India, where Zoroastrianism was experiencing a resurgence. Later, I spent four years with charismatic evangelical Christians in Chicago and San Francisco, observing how they developed an “intimate relationship” with an invisible God. Along the way, I studied newly Orthodox Jews, social-justice Catholics, Anglo-Cuban Santeria devotees, and, briefly, a group in southern California that worshipped a US-born guru named Kalindi.
Most of these people would describe themselves as believers. Many of the evangelicals would say that they believe in God without doubt. But even the most devout do not behave as if God’s reality is the same as the obdurate thereness of rocks and trees. They will tell you that God is capable of anything, aware of everything, and always on their side. But no one prays that God will write their term paper or replace a leaky pipe.
Instead, what their actions suggest is that maintaining a sense of God’s realness is hard. Evangelicals talk constantly about what bad Christians they are. They say that they go to church and resolve to be Christlike and then yell at their kids on the way home. The Bible may assert vigorously the reality of a mighty God, but psalm after psalm laments his absence. “My God, my God, why hast thou forsaken me?”
Beliefs are not passively held; they are actively constructed. Even when people believe in God, he must be made real for them again and again. They must be convinced that there is an invisible other who cares for them and whose actions affect their lives.
This is more likely to happen for someone who can vividly imagine that invisible other. In the late 1970s, Robert Silvey, an audience researcher at the BBC, started using the word “paracosm” to describe the private worlds that children create, like the North Pacific island of Gondal that Emily and Anne Brontë dreamed up when they were girls. But paracosms are not unique to children. Besotted J.R.R. Tolkien fans, for example, have a similar relationship with Middle-earth. What defines a paracosm is its specificity of detail: it is the smell of the rabbit cooked in the shadow of the dark tower or the unease the hobbits feel on the high platforms at Lothlórien. In returning again and again to the books, a reader creates a history with this enchanted world that can become as layered as her memory of middle school.
God becomes more real for people who turn their faith into a paracosm. The institution provides the stories — the wounds of Christ on the cross, the serpent in the Garden of Eden — and some followers begin to live within them. These narratives can grip the imagination so fiercely that the world just seems less good without them.
During my fieldwork, I saw that people could train themselves to feel God’s presence. They anchored God to their minds and bodies so that everyday experiences became evidence of his realness. They got goose bumps in the presence of the Holy Spirit, or sensed Demeter when a chill ran up their spine. When an idea popped into their minds, it was God speaking, not a stray thought of their own. Some people told me that they came to recognize God’s voice the way they recognized their mother’s voice on the phone. As God became more responsive, the biblical narratives seemed less like fairy tales and more like stories they’d heard from a friend, or even memories of their own.
Faith is the process of creating an inner world and making it real through constant effort. But most believers are able to hold the faith world — the world as it should be — in tension with the world as it is. When the engine fails, Christians might pray to God for a miracle, but most also call a mechanic.
Social isolation can compromise a person’s ability to distinguish the paracosm from the everyday world. Members of Heaven’s Gate never left their houses alone. They wore uniforms and rejected signs of individuality. Some of them even underwent castration in order to avoid romantic attachments. When group members cannot interact with outsiders, they are less likely to think independently. Especially if there is an autocratic leader, there is less opportunity for dissent, and the group becomes dependent on his or her moral authority. Slowly, a view of the world that seems askew to others can settle into place.
When we argue about politics, we may think we are arguing over facts and propositions. But in many ways we argue because we live in different paracosmic worlds, facilitated these days by the intensely detailed imaginings of talk radio and cable news. For some of us, that world is the desperate future of the near at hand. If abortion is made illegal, abortions will happen anyway, and women will die because they used clothes hangers to scrape out their insides. Others live in a paracosm of a distant future of the world as it should be, where affirmative action is unnecessary because people who work hard can succeed regardless of where they started.
Recently, the dominant political narratives in America have moved so far apart that each is unreadable to the other side. But we know that the first step in loosening the grip of an extreme culture is developing a relationship with someone who interprets the world differently. In 2012, for example, a woman named Megan Phelps-Roper left the Westboro Baptist Church, a hard-line Christian group that pickets the funerals of queer people, after she became friendly with a few of her critics on Twitter. If the presence of people with whom we disagree helps us to maintain common sense, then perhaps the first step to easing the polarization that grips this country is to seek those people out. That’s the anthropological way.
Hearts and Minds
By Laila Lalami
Last summer, in Afghanistan’s Parwan province, the United States air-dropped leaflets depicting a lion chasing a white dog. “Get your freedom from these terrorist dogs,” the text read in Pashto. “Help the coalition forces find these terrorists and eliminate them.” Because Afghanistan has a literacy rate of just 38 percent, symbols are often the most effective way to convey a message. To make their meaning clear, American military officials chose a dog, considered unclean in some Islamic traditions, to represent Taliban insurgents. The flag of the Taliban is white, so the dog on the leaflet is also white. And since the Taliban’s flag bears the shahada, the Islamic declaration of faith, the army printed it on the body of the dog as well. But the shahada is shared by Muslims regardless of country, culture, or tradition, which meant that the United States had inadvertently portrayed itself, on its own leaflets, as a predator of all Muslims.
The fallout was swift and unsurprising. A Taliban suicide bomber blew himself up outside Bagram Air Base in retaliation. Afghans found the flyer offensive, and the governor of the province lodged a complaint with US commanders. They issued an apology, but the damage had been done. “It is a major abuse against Islam,” Mohammad Zaman Mamozai, the police chief of Parwan, told reporters. “Why do they not understand or know our culture, our religion and history?”
Mamozai had a point. America’s mission in Afghanistan is, in part, to “win hearts and minds,” but it would appear that the work of understanding is to be done by the people who are being occupied, without much effort on the part of the occupier. This asymmetry seems jarring to Afghans but not necessarily to Americans, who too often take it for granted that their country is a force for good.
America first developed the strategy of winning hearts and minds (also known by its unfortunate acronym, WHAM) during the Vietnam War. President Kennedy believed that in order to defeat an insurgency, it was not enough to overpower the enemy on the battlefield; civilian populations also had to be persuaded to join the American side, through social, cultural, or economic projects. One of these was the Strategic Hamlet Program, which relocated Vietnamese farmers to fortified villages where they could receive basic services and protection from insurgent attacks. But the move created resentment among farmers who lost their ancestral lands, and the villages often fell to Vietcong forces anyway. Nevertheless, President Johnson, Kennedy’s successor, continued to believe that WHAM was the key to winning in Vietnam. “The ultimate victory,” he said in 1965, “will depend upon the hearts and the minds of the people who actually live out there.”
When I was growing up in Morocco, my encounters with America were primarily cultural. I watched Little House on the Prairie on television, pined for Michael Jackson’s red leather jacket, and copied Prince’s dance moves. I practiced English pronunciation by listening to Madonna. (“Muh-teer-ee-uhl girl,” not “Mah-teer-ee-uhl girl,” etc.) In college, I studied American history and the US Constitution as part of my major in English, and often used the library at Dar America, an embassy-sponsored cultural center in Rabat. For a few months in my twenties, I even worked as an education liaison for Amideast, an American nonprofit.
But familiarity with America’s culture and use of its educational resources did not translate for me into an endorsement of its foreign policy, which is why I have always found the concept of winning hearts and minds to be absurd. The minds of others are not empty vessels into which support for America can be poured. Persuading a society to abandon one form of government for another requires dialogue, and people are unlikely to listen to what you have to say if you’re also dropping bombs on them.
None of this means that there is no room for persuasion in international confrontations. The nuclear deal with Iran and the détente with Cuba, for example, were the outcome of sustained diplomatic work. But these efforts were successful because they brought both sides to the negotiating table, whereas WHAM relies on unilateral communication backed by violence. The fundamental flaw of WHAM is that it proceeds from the naïve belief that to know America is to love it, and thereafter accept its actions, even when they endanger one’s own survival. That WHAM has been an unquestioned part of US foreign policy for decades illustrates America’s peculiar insistence on its own innocence.
In spite of WHAM’s documented failure to quell insurgencies, George W. Bush decided to use the strategy when he invaded Afghanistan and Iraq. “Across the world, hearts and minds are opening to the message of human liberty as never before,” he told the UN General Assembly in 2005, citing as proof the two countries’ recent elections. To help shape opinion abroad, the Bush Administration launched a satellite TV channel that broadcast daily news, talk shows, and cultural programming across the Middle East and North Africa. The administration named it Alhurra (Arabic for “the free”). But just as Johnson failed to win over villagers whose homes were being torched with napalm, so too did Bush struggle to convince the fabled “Arab street” of his good intentions. Alhurra, which is based in Virginia and receives its funding from Congress, could not compete with the hundreds of local and satellite channels in the Arab world, the most formidable of which, Al Jazeera, aggressively covered the Iraq War, including the atrocities at Abu Ghraib prison and the massacre of civilians in Haditha. Alhurra continues to broadcast throughout the Middle East, but it is seen for what it is: propaganda by an occupying power.
As the American war in Afghanistan enters its seventeenth year, “winning hearts and minds” has become little more than a painful cliché. Soon it may be discarded altogether. “I studied Afghanistan in great detail and from every conceivable angle,” President Trump declared in August, with no detectable irony. Earlier in the summer, the Pentagon had announced plans to deploy 3,900 more troops to the country. But this time there would be no effort to win over the local population. “We are not nation-building again,” Trump said. “We are killing terrorists.” He also gave James Mattis, his secretary of defense, the authority to loosen rules of engagement on the battlefield, a move that may result in an increase in Afghan civilian casualties.
Although Americans have been divided over the war in Afghanistan, previous presidents could rally support by saying that soldiers were building roads or schools “out there.” But by bluntly stating that the United States is not interested in helping the people whose lands it occupies, Trump has removed the mask of compassion. Millions of quiet Americans may soon find that they do not like the brutal face beneath the mask. As shameful as it is, this admission may be the best chance we have of ending the war.
Redemption Song
By Hanif Abdurraqib
In 1975, Bob Dylan went to visit the boxer Rubin “Hurricane” Carter in a New Jersey prison. Carter had been convicted of a triple murder at a bar nine years before, but he continued to maintain his innocence, arguing that he had been the victim of a racially motivated prosecution. No weapons or fingerprints had been found linking him to the crime, yet an all-white jury had declared him guilty after less than a day of deliberation.
Dylan left the prison moved to write a song. In “Hurricane,” released on his 1976 album Desire, he narrates Carter’s tale of inequity in eleven linear verses:
How can the life of such a man
be in the palm of some fool’s hand?
To see him obviously framed
couldn’t help but make me feel ashamed to live in a land
where justice is a game.
The final cut of “Hurricane” is more than eight minutes long. It was one of Dylan’s most successful singles of the Seventies, charting at thirty-three on Billboard. The song helped raise awareness about Carter’s case and brought him a wave of popular support. A decade later, a judge overturned Carter’s conviction, stating that it had been based on “an appeal made to racism rather than reason.”
I grew up on protest music and my parents grew up on protest music and I imagine their parents might have, too. A protest song is a song that keeps people alive in the face of a system that wants them dead. It can be an educational tool or a rallying cry but is most powerful when it is both — when it not only informs the public but pushes them to action, the way “Hurricane” did. Songs of protest do not always persuade their audiences, but they present an opportunity for people to change their minds. As they are replayed on the radio or sung in homes, their messages creep into the lives of their listeners.
Protest songs have always played a crucial part in political movements. In the Eighties there was U2’s “Sunday Bloody Sunday,” and then N.W.A.’s “Fuck tha Police”:
Police think
they have the authority to kill a minority.
Fuck that shit, ’cause I ain’t the one
for a punk motherfucker with a badge and a gun.
But the protest music that exists now, some of it explicitly anti-Trump — YG’s on-the-nose “FDT (Fuck Donald Trump),” Death Cab for Cutie’s “Million Dollar Loan” — is being released into a more deeply polarized society. Artists risk alienating a precious part of their fan base if they take a stand against something in a song, and for many, now that they have to scrape by on the pennies they earn from streaming services, the stakes are simply too high.
So they offer us pleasantries instead. They argue that everything is political because even moments of joy and lightness can be wrested away — by violence, by the state, by the fear of something worse on the horizon. In this way, they say, the act of holding on to that joy is radical. The poet writes of flowers, and this is political because the landscape might soon be destroyed. The pop star sings of love, and this is political because others are advocating hatred. In such a world, it is easy for an artist to be celebrated for even the most mundane sentiment.
I find myself, now, turning to protest songs from the past. The first time I heard Public Enemy’s “Fight the Power,” it was blaring from Radio Raheem’s boom box in Spike Lee’s 1989 film Do the Right Thing: “Our freedom of speech is freedom or death / We got to fight the powers that be.” The song was a source of power and comfort to Raheem, who played it on repeat as he walked through the neighborhood.
I am also a young black man who is attached to music at all times. I think about what it meant for me to see Raheem’s boom box smashed by Sal, the white owner of the pizzeria — to watch Raheem fly into a rage and be choked to death by policemen as the speakers coughed out a few dying notes of the song. I wonder now if the protest song is also a receptacle for our rage, something that calms us because it channels the anger that we’re holding inside. What do we do with that rage when the music stops?
At the end of a protest against racism at Yale in November 2015, someone played Kendrick Lamar’s “Alright” on a stereo. The day was unseasonably warm, and we had been marching for nearly an hour, filling the streets of downtown New Haven. It is difficult to articulate the toll this type of movement takes on the body. To march, and chant, and balance on the edge of rage, sadness, and fear is draining in ways that few other things are.
Like most songs today, “Alright” isn’t explicitly political, but it is often played at protests as a kind of short, joyful break. Once the song rolled into its chorus — with the iconic chant of “We gon’ be alright” — one person yelled, and then another. The chorus reached its climax, and then the dancing began. I remember feeling an overwhelming sense of relief as the song was played. It felt as if the rigor of the day were leaving my body, note by note. As I danced, it occurred to me that contemporary music may have a different function in protest altogether. Perhaps today’s protest song is not one that changes people’s minds but one that sustains people who have already been convinced: a song that aims not to take down the powerful but to lift up the weary — those who may be looking for a way to keep going when all seems lost.
Return to Reason
By David Bromwich
Marx wrote in his Theses on Feuerbach: “The philosophers have only interpreted the world in various ways; the point is to change it.” He was speaking as a revolutionary who wanted his readers to pass from metaphysics to action. Persuasion, as we commonly view it, recognizes no such dichotomy: it wants both to interpret and to change the world. It may be defined as the process of making people think and act as they did not before; yet it has to work in the dense medium of existing beliefs.
At the far reach of belief — belief at its most tenacious — lies conviction. The Victorian political writer Walter Bagehot devoted a remarkable essay, “On the Emotion of Conviction,” to the state of mind that word denotes:
A hot flash seems to burn across the brain. Men in these intense states of mind have altered all history, changed for better or worse the creed of myriads, and desolated or redeemed provinces and ages. Nor is this intensity a sign of truth, for it is precisely strongest in those points in which men differ most from each other.
On a continuum with the hot flash of conviction are all the milder gradations of belief: leaning, tendency, inclination. Bagehot took care to notice that conviction, though it seems inextricable from our nature, is neither good nor bad in itself. We should not suppose that the best society imaginable is the one containing the largest number of persons of unshakable conviction.
Politicians seek to plant convictions in their constituents for various reasons, some of them reckless or self-serving. They stir up anger at a useless sacrifice, or resentment at an empty promise. Yet to admit that politics is a realm of emotions is not to concede that its techniques are necessarily irrational. Emotions have their own cognitive work to perform, and in ordinary life they often cooperate with reasonable thinking. A demagogue is defined less by the content of his appeal than by the hectic speed and intensity with which he exploits ready-made feelings. The most dangerous conservative demagogue is one who blames a vulnerable group for a reversal of fortune whose actual causes are complex. The most dangerous liberal is one who angrily invokes the sufferings of the oppressed while doing little to combat the forces of oppression.
If, without demagoguery, you want to bring national sentiment to your side, you must first demonstrate that you respect the citizens of the nation. The road to honest persuasion passes through conciliation, and this requires an initial presumption of goodwill. Opponents must be recognized as people who are not vicious by nature, even as they are warned candidly about the wrongs they may commit. Lincoln, in his speech on the Kansas-Nebraska Act of 1854, confessed that he hated slavery for the “monstrous injustice” of the institution, but the same speech allowed that the Southern people “are just what we would be in their situation. If slavery did not now exist amongst them, they would not introduce it. If it did now exist amongst us, we should not instantly give it up.”
The trust of democracy is that if you give reasons honestly and your reasons pass the test of serving the public good, the voters will master your argument and come to approve it. But that depends on an initial trust that persuasion can sometimes occur, a thing that is surprisingly hard to prove. We know that people come to think in ways they had never thought before, but the cause of such a shift may be a religious conversion, an intense friendship, anything rather than a conscious decision arrived at through argument. All we can say is that something has changed in someone’s beliefs; how it happened remains mysterious to a degree. Still, a generous trust in the efficacy of persuasion is essential to the social contract in a system of self-government. The anxiety that many Americans feel today — it is a detectable irritant in half the op-eds published in the New York Times, the Washington Post, and the Wall Street Journal — comes from a suspicion that we have lost our trust in the possibility of persuasion. Have Americans stopped trying to debate one another?
Trust in persuasion seems hardly separable from the necessary minimum of trust in government. What is the Constitution if not a system for managing disagreement and convergence, a system that leaves open the possibility of conflict without everlasting bitterness and reproach? Between the Forties and the Sixties, most Americans did apparently share such a trust. Their government had saved them from the worst effects of the Great Depression (which they could witness in Europe), led the country to victory in a war most believed was necessary, created systems of social insurance and interstate highways, and made sure that clean and inexpensive housing was available for millions. Government was a major source of the belief that American society was improving, that one’s children had a decent chance to be better off than oneself.
Since the Seventies, by contrast, distrust of government — accompanied by a loss of belief in the persuasive methods by which government must gain legitimacy — has become a common, if not the dominant, national sentiment. A major cause of the change was the personality of Ronald Reagan, who disparaged the federal government he ran while claiming descent from the heroic age of the New Deal and the Second World War. Did he say that ceding control of the Panama Canal was a betrayal of US sovereignty and a threat to our vital interests? He seemed a politician of such transparent simplicity that he could not be saying these things from ideological narrowness or obliviousness. The force of his personality carried the weight of an argument that was otherwise held together by spurious anecdotes and half-baked theories. And yet his evocation of a “shining city on a hill” answered sufficiently to people’s desire to restore the promise of a better time.
People naturally look to their self-interest, but most Americans want to be part of something bigger, and politicians are aware of this fact. The classic response of the liberal is to broaden our sense of community beyond tribe or sect or social class; for the conservative, it is to strengthen our allegiance to a common past. But neither tendency can altogether exclude the other. The penalty for doing so is total inertia at one extreme and totalitarian innovation at the other. In the Clinton and Obama years, the way of community was called globalization, something people were not sure they wanted, even if they could be sure what it meant. Meanwhile, the shining city on a hill had been emptied of practical meaning long before the conservative movement of the Eighties was swallowed up in the flailing nationalism of Trump.
Today, we are living at a hollow moment, a protracted and uncertain interregnum in which citizens are searching for leaders and both major parties are groping for a principle they can stand by. Any responsible political power that emerges — a power capable of imparting conviction to people who think as citizens — will speak a language that does not take passion for cogency, or enthusiasm for proof.
No Justice, No Peace
By Mychal Denzel Smith
In 1829, David Walker, a free black man and abolitionist in Boston, published Appeal to the Coloured Citizens of the World, a pamphlet that called for enslaved blacks to rise up violently against the institution of slavery. Walker enlisted black seamen to distribute the pamphlet, and within a year it was being read across the South. The Appeal was considered so dangerous that, in Georgia, a $10,000 bounty was placed on Walker’s head, and the mayor of Savannah asked his counterpart in Boston to prevent the publication of a third edition. These efforts were unsuccessful, and Walker even added a new introduction denouncing the hypocrisy of those white Americans who professed to be Christlike while tormenting their fellow citizens. “All I ask,” he wrote,
is that the world may see that we, the Blacks or Coloured People, are treated more cruel by the white Christians of America, than devils themselves ever treated a set of men, women and children on this earth.
And Walker went further, reprinting the opening paragraph of the Declaration of Independence and writing, “See your Declaration Americans!!! Do you understand your own language?” In doing so, he created a compelling template for black political protest in the United States. The invocation of American ideals, particularly those set forth in the country’s founding documents, has been a tactic used by black intellectuals and activists from Anna Julia Cooper to Martin Luther King Jr., who made music from the words “all men are created equal.” They spoke of black liberation in American terms — asking their countrymen not to change their fundamental principles but simply to start living by them.
Colin Kaepernick’s protest continues this tradition. During the 2016 NFL preseason, Kaepernick, who was then the quarterback for the San Francisco 49ers, took to sitting on the bench and, later, kneeling on the field while the national anthem was played. Events of the past few years, in particular the killing of hundreds of unarmed black people by the police, had caused a political awakening in Kaepernick. When asked why he would not rise for the anthem, he replied, “I am not going to stand up to show pride in a flag for a country that oppresses black people and people of color.”
Consciously or not, Kaepernick was reprising the argument made by Ralph Ellison in his 1953 essay “Twentieth-Century Fiction and the Black Mask of Humanity”:
By excluding our largest minority from the democratic process the United States weakened all national symbols and rendered sweeping public rituals which would dramatize the American dream impossible.
And by kneeling during the anthem, Kaepernick exposed once again the tension between those symbols and the black experience of state violence.
The American delusion of equality, however, prevents us from seeing this unsettling truth. Despite empirical evidence of persistent racism within education, housing, criminal justice, government — within every institution in this country — Americans choose to ignore its continued influence. Anyone who dares to point it out faces a penalty: Kaepernick was cast as a spoiled, millionaire athlete with no real grievances, an ungrateful civilian who abused the flag that American troops had died to protect. Then he lost his job.
Sports pundits often say that before the next draft, Kaepernick needs to convince one of the thirty-two NFL teams that he is committed to football — that he would not be a “distraction” on the field. I have not heard anyone suggest that the United States needs to convince Kaepernick that his protest has been seriously considered. In America, it is the job of the powerless to show the powerful that their moral authority is an illusion. Labor must persuade management to value its work; women must persuade men to stop their abusive behavior. Black people have spent our entire history attempting to convince white people — first of our humanity, then of our right to revel in it fully.
In September, President Trump tweeted that the few NFL players kneeling in solidarity with Kaepernick should be fired. With a single tweet, he lent the voice of government to those opposing the protests, not unlike the mayor of Savannah who sought to halt the publication of Walker’s Appeal. “The very support which they draw from government,” Walker wrote, “aids [slave owners] in perpetrating such enormities. . . . And yet they are calling for Peace! — Peace!! Will any peace be given unto them?”
There are calls for peace and solidarity almost as soon as a protest begins. Order is valued over justice, and pressure mounts for protesters to be reasonable, to organize their protest in a way that does not shock white people or cause them to recoil. But solidarity is impossible under unequal conditions. Peace is not the concern of the agitator. There is a difference between marching in an orderly fashion along a predetermined route with a lawfully obtained permit and chaining your body to the doors of a police precinct. Each may serve the struggle in some way, but only one is a direct attack on power.
Kaepernick’s protest was not Walker’s violent uprising. But it was far bolder than the facile act of standing together with locked arms — a compromise gesture adopted by some NFL players to support him while maintaining their good standing among fans and owners.
The proper role of protest is to dramatize the unequal distribution of power. What protests are not charged with is upholding reverence for the institutions that make them necessary. A brutal system of police, prosecutors, and politicians has rendered American symbols meaningless, and the onus is on the US government to restore their meaning — to convince the marchers and kneelers and petitioners and organizers of its commitment to progress. We achieve peace not by demanding that those who expose our contradictions be silent but by pressuring the powerful to convince the rest of us that there is no reason to shout.
Deus ex Machina
By Kelly Clancy
E. M. Forster, in his 1909 short story “The Machine Stops,” imagines a world thousands of years in the future, where humans live in subterranean pods, fearing the air above and relying on an omniscient machine for their every need. The Machine’s user manual is their holy book, and in their prayers they celebrate its many blessings:
The Machine feeds us and clothes us and houses us; through it we speak to one another, through it we see one another, in it we have our being. The Machine is the friend of ideas and the enemy of superstition.
They are so familiar with the failings of the human mind that they no longer trust their senses, their own eyes and ears, and prefer to ask the Machine for answers rather than experience things firsthand.
Forster’s vision eerily anticipates the promise of artificial intelligence — that algorithms will save us from ourselves by making decisions rationally, obviating the need for human persuasion, which is so often based on emotion, prejudice, and manipulation. Recently, in Silicon Valley, Anthony Levandowski, a former engineer at Uber and Google, founded a nonprofit called Way of the Future that is working to develop just such a machine. Levandowski believes he can create “a Godhead based on artificial intelligence” — a manifestation of the divine in silico.
There are reasons to fear a future dictated by AI. In the New York Times in 2014, the writer Nick Bilton warned that a medical robot programmed to fight cancer could “conclude that the best way to obliterate cancer is to exterminate humans who are genetically prone to the disease.” That would indeed be scary, but in its current state AI is not the coldly logical force of reason that we might imagine. In fact, the present threat is that our models are too human, operating with the biases inherent in the data we feed them. They are often deeply flawed, and yet we are easily swayed by the information they supply, because it has the sheen of scientific objectivity.
The risks of our misplaced faith in AI have already manifested in the criminal justice system. For example, the software company Northpointe owns a proprietary algorithm called COMPAS, which is used by courts to score the risk of recidivism for parole candidates. Northpointe has refused to disclose how the algorithm works, but an assessment by ProPublica found that it is twice as likely to misclassify African Americans as high-risk compared with whites. (Northpointe has disputed this analysis.) In 2016, Eric Loomis, a man from Wisconsin who was sentenced to six years in prison in part because of his COMPAS score, challenged the algorithm in court, claiming that the way it was applied violated his right to due process. Although the judges acknowledged that “concern about ad hoc decision making is justified,” Loomis lost his case, and COMPAS remains in use across the nation.
The laws of evolution suggest that a generalized superintelligence such as Forster’s Machine may never exist. In the closing lines of On the Origin of Species, Charles Darwin writes of a “tangled bank, clothed with many plants of many kinds, with birds singing on the bushes . . . and with worms crawling through the damp earth.” Each evolved into a prodigy within its niche — this strain of bacteria is a virtuosic fermenter, that bat is a genius of the ultrasonic hunt — but excellence in one area comes at a cost. There are already indications that AI will be similarly constrained. In 1997, the mathematicians David Wolpert and William Macready developed the “no free lunch” theorem to describe the way algorithms “pay” for their high performance on certain problems by becoming worse at others. The future may therefore be a collection of specialized AIs: the lesser gods of loan assessment and weather prediction.
Forster’s story hints, however, that the greatest danger is not our ability to engineer a perfect intelligence but rather our belief in such a promise. We trust that there is order in the universe and that we can discern it, or at least that our tools might be able to.
This faith can be easily exploited. For centuries, the Catholic Church sought to limit the circulation of Bible translations in order to control access to the word of God. Mayan elites jealously guarded knowledge of hieroglyphics, insisting that they alone could mediate between the gods and common men. It is easier to persuade people when you can refer to a source of information beyond their comprehension; AI could become the new instantiation of that higher power.
Scholars such as Zeynep Tufekci have long argued that we cannot afford to trust the “black box” algorithms that are being developed by corporations in secrecy. Machine-learning software should be open-source — visible to anyone with a computer. But a more fundamental problem in democratizing AI is the general lack of tech literacy. Alan Kay, a professor of computer science at UCLA, is working to solve this problem by designing new coding languages that are simple and instinctive, appealing to our physical intuition of the real world. Even if computer science were made mandatory in public schools, however, there would not be enough experienced teachers to fill the classrooms. Coding skills are particularly rare among women, non-whites, and low-income Americans. Women account for only 18 percent of computer-science undergraduate degrees, and non-whites for only 20 percent.
This past fall, the journalist David Lewis infiltrated a white-nationalist convention in Seattle and found that most of its members were young men working in the tech industry. “You’re also a coder?” Lewis reported hearing the participants ask one another. “Do you mind if I send you something I’ve been working on?” Greg Johnson, the event’s organizer, shared with Lewis his plans to embed white-supremacist “secret agents” in Silicon Valley and corporate America. The agents would rise through the ranks by paying “lip service” to diversity and, once they had enough power, would start hiring other racists and excluding non-whites.
Lewis’s reporting shows how important it is that we know who is building our programs — that we understand who is trying to convince us, and of what. Without access to such knowledge, we will be forced to accept the decisions of our algorithms blindly, like Job accepting his punishment.
An inescapable feature of progress is that we must use new tools before we fully understand them. Language, humankind’s foundational technology, is one example of this. There were no linguists mapping grammar before the first infant babbled some nascent tongue. But the beauty of language is that it enables its own dissection: it is through language that we have come to understand the power of language. We must demand the same transparency from our algorithms. The current state of AI — given its ability to persuade, and given its tendency toward bias and opacity — threatens to entrench existing power structures. Instead, we must endeavor to build AI that can expose itself to us. There is a great danger in creating gods with whom only a few can communicate.
Sons and Lovers
By Garth Greenwell
A few months ago my mother visited me in Iowa, arriving from her home in Elizabethtown, Kentucky, just in time for a literary event I was hosting. And so I watched the collision of two worlds I have strenuously kept apart: the world of my childhood, which I fled when I was sixteen, and the world of my adult, professional life.
My mother did what Southern mothers do, especially when they’ve had wine with dinner. She talked to everyone she could, making sure they knew that she was my mother, that she had raised me through great travail, that there had been hard years, terribly hard, but that now — oh, she said to my half-bewildered, half-charmed friends, she was so proud. It was lovely and mortifying and, from the distance I tried to keep while it was happening, more than a little comical.
But she didn’t just make sure that everyone knew I was her son; she also wanted everyone to know her connection to Luis, my boyfriend, whom she declared (inaccurately, sweetly) her son-in-law. This was more fun for me to watch; it was also astonishing, proof of the long journey that my conservative, Christian mother has made in her feelings about gay people.
Always remember, I tell my students when they ask for advice about coming out, your parents’ first reaction won’t be their last. It was lifesaving luck that my mother didn’t reject me for being gay: she took me in when my father, who divorced her when I was five, kicked me out. But she did make clear, in countless ways, that she considered me a source of shame.
I’ve never forgotten the only time, before Luis, that I brought a boyfriend home. I was twenty, and we were in Louisville for my oldest sister’s wedding. At the reception, when I reached for this boyfriend’s hand, my mother was suddenly between us, snatching our hands in hers and forcing us apart. She had been watching us the entire evening, ready to intervene. I realize now that she must have been terrified. This is your sister’s day, she hissed at me, the implication being that any display of affection between my boyfriend and me would have ruined it.
Prejudice depends on a kind of categorical thinking — though thinking may be the wrong word — that resists argument. The knowledge that combats this kind of prejudice isn’t rational but experiential: it depends on the attention we pay not to the abstract but to the particular and individual.
I saw this repeatedly during the four years I taught high school in Sofia, Bulgaria, where I was the only openly gay person most of my students had ever met, and where homophobia often flared up in pranks or threats, some of them painful. I still remember the shock I felt when, at the end of my first year, I saw a Facebook post from a just-graduated senior I had never taught, which was full of rage at the presence of a gay teacher in his school. Smurt na pedalite, he wrote: “Death to all faggots.”
Early in my career as a teacher, I sometimes debated my students about the value of LGBT lives, conversations that were exhausting and almost never productive. It was more convincing, I realized, to demonstrate that value in the texts I chose and in the way I spoke about my own life. It’s difficult for ideas about a whole group of people to survive concrete, daily acquaintance with an actual human being. This may be especially true in a high school literature classroom, where ideas are exchanged and stories told, and where students are young enough not to have lost their aptitude for change.
I know that not all my students changed their minds about queer people; I also know that some of them did. One day at the end of the school year, a sophomore, P, stopped by my classroom. P was very smart and very surly; he made an immediate impression. He was short for his age, and maybe as compensation he spent his afternoons in the gym; he was built, I remember thinking, like a little macho tank. Students often visited their teachers after final exams to say goodbye and to thank us for the year. But it was clear P wanted to say more. In Bulgaria, he said, looking away from me, lots of us have stupid ideas about gay people. I wanted to tell you that I’m sorry for that. Now I don’t believe those things anymore, he went on, and I wanted to thank you for that.
It’s easy to forgive a sixteen-year-old for having held a wrong belief; it’s harder to forgive an adult, and maybe especially hard to forgive a parent. But there was a corollary to the advice I gave my students about coming out: Do whatever you need to do to protect yourself from your parents’ first response. And then, as their response changes, try to be open to the possibility of forgiving them.
I struggle to follow this advice. I haven’t spoken with my father in more than a decade. My relationship with my mother, too, has sometimes been difficult, even though she always made clear to me that her love was more important than her shame. I’ve kept a great deal of my life from her, protecting it from the pain caused by moments like the one at my sister’s wedding, when she showed me that my relationships, and therefore my life, didn’t have the same value for her as the relationships and lives of my siblings.
That pain, like the prejudice that causes it, is resistant to rational or moral persuasion, and I don’t think I realized how much I was holding on to it until I felt the relief of its lifting. On the morning after the event that I hosted, my mother came to the house I’ve shared with Luis for the past two years. It was her first time visiting us there, and the sight of the home we’d made together, of the order Luis had brought to my habitual chaos, moved her to tears. After breakfast, as she was hugging Luis goodbye, I heard her thank him for loving her son.
Es estupenda, Luis said to me later that night: “Your mother is wonderful.” For decades I would have responded to this with ambivalence, with some deflection or qualification. Maybe I’ll feel the need for that again, but I didn’t feel it then. Yes, I said simply, wondering a little at the ease with which I said it: yes, she is.