Conversation — October 26, 2016, 8:00 am

Eating Right

“I think that the metaphor of seeing ethics in terms of a supermarket array of consumption decisions is all too pervasive in contemporary society,” says philosopher Paul B. Thompson.

Healthy is no longer enough. How we evaluate food must now consider justice for small-scale agricultural laborers, the impact of industrial farming practices on animals, and the downstream effects of genetic modification. Set against what is now a global obesity epidemic, such concerns have prompted a renewed interest in food ethics. But eating is not just about consumption. How, when, and, of course, what we eat engage questions that are as much about culture and tradition as they are about calories. In his most recent book, From Field to Fork, published in 2015 by Oxford University Press, philosopher Paul B. Thompson investigates how we build food systems in developed economies and the complicated social and moral questions that surround how we produce, distribute, and consume our food. Thompson holds the W.K. Kellogg Chair in Agricultural, Food and Community Ethics at Michigan State University. A leading voice in agricultural ethics for more than three decades, Thompson incorporates his work as an applied philosopher into his frequent collaborations with farmers, scientists, and environmentalists. In our conversation, we explore the consequences that accompany everyday choices about what we eat.

From Field to Fork carves out space for philosophy in a national conversation about food dominated by nutritional science and agricultural economics. Is there room for ethics in our deliberations on food?

It’s important to notice that people gravitate toward the idea of “food ethics” quite naturally. While much interest in food is grounded in personal health and aesthetic enjoyment, our food environment has become saturated with suggestions that we can “do good” by consuming foods that are “fairly traded,” “humanely produced,” or in some other way grown and marketed with methods that promote ethically important ends. In 2006, Michael Pollan framed The Omnivore’s Dilemma as a set of moral choices about what one eats. At the same time, even personal health and aesthetic motives are only a half step away from concerns that have historically been understood to have an ethical dimension. I wrote From Field to Fork in part to expose how thinking of ethics only in terms of how our acts can harm others—a pattern we see in John Stuart Mill’s On Liberty—shapes but also limits the way that we might understand food ethics. If we think of ethics as an activity—as something people do, rather than something people have—we come to the idea that thinking more critically about the total food system could have implications that go far beyond our dietary choices.

Your suggestion of an enlarged role for food ethics challenges the conventional emphasis on enlightened consumer choice. As you mention, Americans are increasingly supportive of fair-trade coffee and humanely raised chickens. Can we make this easy? Will informed purchasing lead to an ethical diet?

I certainly don’t want to discourage people from supporting ethically laudable ends through their purchase behavior. Yet even a moment’s reflection alerts us to the trouble that rides along with thinking that our more informed purchases make us more ethical. The creators of South Park nailed the problem when they parodied Prius drivers by announcing a “smug alert.” Food ethics should lead to an awareness of the many ways in which our consumption activity is implicated in ethically problematic practices and institutions. It will not be possible to fix all of these problems with better shopping.

How are these practices and institutions “ethically problematic”?

Farmers and other food industry firms necessarily see themselves as competing for the food consumer’s dollar. On the one hand, this competitive structure has driven them to find ways to cut costs, and many of these savings are passed on to consumers. On average, Americans spend a smaller share of their household income on food than the people of any other nation. From an ethical perspective we benefit twice over. We spend the money we save on food on other goods that can improve our quality of life, and eating cheaply is especially important for those with low incomes, because they spend a higher proportion of their weekly budget on food. Making food less expensive conforms to an egalitarian ethic that emphasizes improving the lives of those who are relatively less well off. On the other hand, producers often cut costs in places that they shouldn’t. They achieve lower production costs by maximizing the return on their machinery and supplies, for example, and that may impose costs on animals, in the form of compromises to their well-being; on the environment, in the form of reduced biodiversity; or on future generations, in the form of declining soil fertility. It can even introduce false economies by shifting the costs that consumers bear from their food dollar to their health care budget. But because we buy food all the time, short-term thinking leads us to economize on our food rather than taking a broader view.

So I’m definitely not against enlightened consumer choice, but enlightened consumer choice alone will not address these larger structural problems. What is more, simply spending more on food just shifts the burden right back onto the poor. Food ethics requires a conversation about how we should tackle these complex problems above and beyond the way we change our personal diets.

Let me follow up with a recent headline. The popular grocery store chain Trader Joe’s announced that in the near future all the eggs it sells will be cage-free. This appears to be the proper ethical choice for the store and its customers, correct?

Food ethics can take us beyond the simple cause and effect thinking that characterizes a utilitarian endorsement of the “better shopping” strategy. In fact, I think that the metaphor of seeing ethics in terms of a supermarket array of consumption decisions is all too pervasive in contemporary society. As an environmental philosopher, I’ve advocated the “systems perspective” and suggested that a focus on outcomes and consumption generally tends to overlook features that can drive a system toward collapse. But one weakness of the systems approach is that it has very little resonance with the experience of an ordinary person. I am hopeful that food ethics may provide some baby steps toward remedying that situation.

Part of the problem is that the consequences of our choices are often hidden. For instance, recent economic studies suggest that pasture-based, or free-range, agriculture is less efficient than large-scale agricultural operations. Should this matter?

This is a good example. There are environmental issues associated with the total burden of animal production—greenhouse gas emissions, water and soil pollution, and the impacts from producing feed—and here, efficiencies matter. Yet pasture-based production of beef and milk can be better for animals, and it can have its own environmental benefits in utilizing grasses and boosting soils. No one—and here I include consumers, economists, and animal scientists—should assume that these trade-offs and tensions can be easily negotiated. I certainly don’t know what the best system is, and I suspect that what makes sense in one place might not make sense somewhere else.

Is the creation of a single ethical framework one of the limitations in our understanding of food systems?

This question suggests that there may be some limitations to the approach that philosophers and other moralists have been peddling for the last 200 years or so. I do think that food ethics has the potential to broaden the appeal of critiques that feminists and critical race theorists have made for the last 25 years. I hope others will follow up on that connection.

Questions about the virtues of various agricultural practices blend easily into long-standing beliefs about the goodness of the farmer. We can trace this tradition back to our nation’s founding, when many thinkers and statesmen, especially Thomas Jefferson, held up the agriculturalist as the ideal American. Is there a way to connect the dots between history, values, and contemporary ethics?

Most Americans know that Jefferson praised farmers, but they have no idea why. Jefferson did not think that farmers were “more moral” in every dimension, but he did think that they made better citizens. Because their livelihood was tied to the land, they had a more natural ability to align their own household interests with those of the nation. Manufacturers and traders could easily relocate elsewhere, and many did in the revolutionary years. Farmers had to stick with it, and they had to make things work once independence had been won. There was thus a sense in which farmers could “see” a connection between their household (as a system) and the nation (as a system). Today we have to go further and see our households and our country as elements of both local and global ecosystems. Maybe it’s too much to ask from food ethics, but we have to start someplace.

Perhaps that starting point can focus on the future of our food systems. In a paper on food security, former Secretary of State Madeleine Albright noted how peculiar it is “to live in a world where hunger is an endemic problem for half the planet while diet books are best-sellers in the other half.” Is food security as much a moral imperative as it is an economic question?

I like the reference to Albright, because one thing that everyone can see about food is that we, and everyone else, need to eat. One of the things I wanted to do in From Field to Fork, though, is to get people beyond the point of thinking about food simply in terms of feeding hungry people. Yes, it’s that, but it’s ironic that many of the global hungry are actually farmers. That should be a tipoff that we have a systems problem.

Let’s circle back to individual behavior. When we talk of limitations in our food systems these are, to state the obvious, large-scale issues. It is all too easy for otherwise engaged citizens—intimidated by the size of the challenge—to throw their hands up in despair.

We face real shortcomings, to be sure, but there are a number of initiatives underway to reform and improve our food system: food hubs, community-supported agriculture, fair trade, urban farming, the list goes on and on. The point of ethics is to integrate a systemic and reflective conversation on the goals and the means pursued by agricultural scientists and entrepreneurial reformers alike. Let me give you one more example from history. Abraham Lincoln laid out the moral foundation of our current agricultural system when, speaking as a presidential candidate in 1859, he said, “Every blade of grass is a study; and to produce two, where there was but one, is both a profit and a pleasure.” As President, he acted on this principle in creating the Department of Agriculture, a federal agency dedicated to farm policy, agricultural research, and food safety. Those obligations are well known. But the department is also charged with community development and hunger relief at home and abroad. Lincoln’s presidential actions implied a more reflective perspective: he brought slave agriculture to an end, its productivity and profitability notwithstanding. Many of our current problems can be traced to a single-minded pursuit of productive efficiencies in our food systems. It’s the single-mindedness that food ethics hopes to counteract. And it is in this setting that individual choices and behaviors do matter. At the moment that food goes into our shopping cart (not to mention into our mouths), we all just do the best we can. But we should also resolve to have a larger and more open-ended discussion about where and how change needs to occur. That’s where ethics, as a practical guide for sustainable choices and a barometer for social justice, can help.
