Forum — From the February 2018 issue

The Minds of Others

The art of persuasion in the age of Trump


Deus ex Machina

By Kelly Clancy

E. M. Forster, in his 1909 short story “The Machine Stops,” imagines a world thousands of years in the future, where humans live in subterranean pods, fearing the air above and relying on an omniscient machine for their every need. The Machine’s user manual is their holy book, and in their prayers they celebrate its many blessings:

The Machine feeds us and clothes us and houses us; through it we speak to one another, through it we see one another, in it we have our being. The Machine is the friend of ideas and the enemy of superstition.

They are so familiar with the failings of the human mind that they no longer trust their senses, their own eyes and ears, and prefer to ask the Machine for answers rather than experience things firsthand.

Forster’s vision eerily anticipates the promise of artificial intelligence — that algorithms will save us from ourselves by making decisions rationally, obviating the need for human persuasion, which is so often based on emotion, prejudice, and manipulation. Recently, in Silicon Valley, Anthony Levandowski, a former engineer at Uber and Google, founded a nonprofit called Way of the Future that is working to develop just such a machine. Levandowski believes he can create “a Godhead based on artificial intelligence” — a manifestation of the divine in silico.

There are reasons to fear a future dictated by AI. In the New York Times in 2014, the writer Nick Bilton warned that a medical robot programmed to fight cancer could “conclude that the best way to obliterate cancer is to exterminate humans who are genetically prone to the disease.” That would indeed be scary, but in its current state AI is not the coldly logical force of reason that we might imagine. In fact, the present threat is that our models are too human, operating with the biases inherent in the data we feed them. They are often deeply flawed, and yet we are easily swayed by the information they supply, because it has the sheen of scientific objectivity.

The risks of our misplaced faith in AI have already manifested in the criminal justice system. For example, the software company Northpointe owns a proprietary algorithm called COMPAS, which is used by courts to score the risk of recidivism for parole candidates. Northpointe has refused to disclose how the algorithm works, but an assessment by ProPublica found that it is twice as likely to misclassify African Americans as high-risk compared with whites. (Northpointe has disputed this analysis.) In 2016, Eric Loomis, a man from Wisconsin who was sentenced to six years in prison in part because of his COMPAS score, challenged the algorithm in court, claiming that the way it was applied violated his right to due process. Although the judges acknowledged that “concern about ad hoc decision making is justified,” Loomis lost his case, and COMPAS remains in use across the nation.
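ProPublica's finding concerned unequal error rates: defendants who never went on to reoffend were flagged as high risk far more often when they were black. As a rough sketch of how such a disparity is measured (the data, group labels, and function below are invented for illustration, not drawn from the COMPAS records or ProPublica's analysis), the comparison amounts to computing a false-positive rate for each group:

# Hypothetical audit sketch: compare the false-positive rate (people who
# did NOT reoffend but were scored high risk) across two groups.
# All values below are made up for demonstration purposes.

records = [
    # (group, scored_high_risk, reoffended)
    ("A", True,  False),
    ("A", True,  False),
    ("A", False, False),
    ("A", True,  True),
    ("B", False, False),
    ("B", False, False),
    ("B", True,  False),
    ("B", True,  True),
]

def false_positive_rate(group):
    """Share of non-reoffenders in `group` who were nonetheless scored high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    if not non_reoffenders:
        return float("nan")
    flagged = sum(1 for _, high_risk, _ in non_reoffenders if high_risk)
    return flagged / len(non_reoffenders)

for g in ("A", "B"):
    # With these invented numbers, group A's rate (0.67) is twice group B's
    # (0.33): the shape of the disparity ProPublica reported.
    print(f"group {g}: false-positive rate = {false_positive_rate(g):.2f}")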

The laws of evolution suggest that a generalized superintelligence such as Forster’s Machine may never exist. In the closing lines of On the Origin of Species, Charles Darwin writes of a “tangled bank, clothed with many plants of many kinds, with birds singing on the bushes . . . and with worms crawling through the damp earth.” Each evolved into a prodigy within its niche — this strain of bacteria is a virtuosic fermenter, that bat is a genius of the ultrasonic hunt — but excellence in one area comes at a cost. There are already indications that AI will be similarly constrained. In 1997, the mathematicians David Wolpert and William Macready developed the “no free lunch” theorem to describe the way algorithms “pay” for their high performance on certain problems by becoming worse at others. The future may therefore be a collection of specialized AIs: the lesser gods of loan assessment and weather prediction.
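Stated roughly, and paraphrasing the notation of the 1997 paper, the theorem says that any two search algorithms a_1 and a_2 perform identically once their results are averaged over every possible objective function f:

\[
\sum_{f} P(d_m^y \mid f, m, a_1) \;=\; \sum_{f} P(d_m^y \mid f, m, a_2),
\]

where m is the number of evaluations and d_m^y is the sequence of values the algorithm has observed. An algorithm can beat the average only on some restricted class of problems, at the price of doing worse on others.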

Forster’s story hints, however, that the greatest danger is not our ability to engineer a perfect intelligence but rather our belief in such a promise. We trust that there is order in the universe and that we can discern it, or at least that our tools might be able to.

This faith can be easily exploited. For centuries, the Catholic Church sought to limit the circulation of Bible translations in order to control access to the word of God. Mayan elites jealously guarded knowledge of hieroglyphics, insisting that they alone could mediate between the gods and common men. It is easier to persuade people when you can refer to a source of information beyond their comprehension; AI could become the new instantiation of that higher power.

Scholars such as Zeynep Tufekci have long argued that we cannot afford to trust the “black box” algorithms that are being developed by corporations in secrecy. Machine-learning software should be open-source — visible to anyone with a computer. But a more fundamental problem in democratizing AI is the general lack of tech literacy. Alan Kay, a professor of computer science at UCLA, is working to solve this problem by designing new coding languages that are simple and instinctive, appealing to our physical intuition of the real world. Even if computer science were made mandatory in public schools, however, there would not be enough experienced teachers to fill the classrooms. Coding skills are particularly rare among women, non-whites, and low-income Americans. Women account for only 18 percent of computer-science undergraduate degrees, and non-whites for only 20 percent.

This past fall, the journalist David Lewis infiltrated a white-nationalist convention in Seattle and found that most of its members were young men working in the tech industry. “You’re also a coder?” Lewis reported hearing the participants ask one another. “Do you mind if I send you something I’ve been working on?” Greg Johnson, the event’s organizer, shared with Lewis his plans to embed white-supremacist “secret agents” in Silicon Valley and corporate America. The agents would rise through the ranks by paying “lip service” to diversity and, once they had enough power, would start hiring other racists and excluding non-whites.

Lewis’s reporting shows how important it is that we know who is building our programs — that we understand who is trying to convince us, and of what. Without access to such knowledge, we will be forced to accept the decisions of our algorithms blindly, like Job accepting his punishment.

An inescapable feature of progress is that we must use new tools before we fully understand them. Language, humankind’s foundational technology, is one example of this. There were no linguists mapping grammar before the first infant babbled some nascent tongue. But the beauty of language is that it enables its own dissection: it is through language that we have come to understand the power of language. We must demand the same transparency from our algorithms. The current state of AI — given its ability to persuade, and given its tendency toward bias and opacity — threatens to entrench existing power structures. Instead, we must endeavor to build AI that can expose itself to us. There is a great danger in creating gods with whom only a few can communicate.
