October 4, 2013

The End of Illth

In search of an economy that won’t kill us

A coal miner two days after the April 5, 2010, explosion at Massey Energy Company's Upper Big Branch mine in Montcoal, West Virginia. © AP Photo/ Bob Bird

On April 5, 2010, a fireball ripped through underground shafts at the Upper Big Branch mine in Montcoal, West Virginia, killing twenty-nine miners. It was the worst mining disaster in the United States in over forty years. When I heard the news on the radio that morning, my first thought was, “I bet it’s a Massey mine.” And it was. Under the ruthless leadership of CEO Don Blankenship, Massey Energy had unapologetically accumulated one of the worst safety records of any coal company in the country.

Between January 2009 and the day of the explosion, the U.S. Mine Safety and Health Administration (MSHA) cited the Upper Big Branch mine for 639 violations, and in the previous decade fifty-four miners had been killed in Massey mines. Jeff Harris, a former Massey employee who quit to work for a union mine, told a Senate committee three weeks after the explosion that the company routinely chose productivity over safety. “Soon as the inspector would leave the property,” Harris said of Upper Big Branch, “they jerk all the ventilation back down and start mining coal.” As a Labor Department investigation later revealed, a properly maintained ventilation system could have prevented the methane buildup that ignited the explosion on April 5.

Testifying before the House Education and Labor Committee, surviving miners from Upper Big Branch described repeated ventilation violations that left flammable coal dust collecting on conveyor belts. That was nothing new in a Massey mine. In the fall of 2005, Blankenship sent a memo to employees that read, “If any of you have been asked by your group presidents, your supervisors, engineers or anyone else to do anything other than run coal (i.e., build overcasts, do construction jobs, or whatever) you need to ignore them and run coal.” That “whatever” might have included stopping a conveyor belt long enough to remove combustible coal waste — on January 19, 2006, a fire broke out along a belt at a Massey mine in West Virginia, killing two men.

Twenty-four days after the tragedy at Upper Big Branch, two more miners were killed, this time in my home state of Kentucky, when a roof collapsed on them 500 feet underground. That occurred at the Dotiki mine, which seemed to follow Blankenship’s premise that installing support beams to stabilize roofs gets in the way of running coal; the mine had been cited for 2,973 violations in the previous five years. (The Dotiki mine was operated by a subsidiary of Alliance Resources, whose chief executive, Joe Craft, had recently brokered a deal with my own employer, the University of Kentucky. Craft pledged $7 million to build a new dorm for the basketball team if, and only if, UK agreed to name the building the Wildcat Coal Lodge. The dorm opened last year.)[1]

[1] In response to the naming of the Wildcat Coal Lodge, writer Wendell Berry withdrew all of the personal papers he had donated to the university’s special-collections library.

As I pondered all of these violations and deaths, a question formed over and over in my mind: What if the workers themselves had owned those mines? That question led to others. Unionized mines do a better job of maintaining worker safety than nonunion ones; would a worker-owned mine be better still? Would the ventilation curtains have remained in place even after the inspectors left? Would the workers have sent themselves a memo, like Don Blankenship’s, pointing out the importance of profits over their own lives? If the miners themselves had owned the mine, would they still be alive?

I began doing some research to see whether any such mines exist among the coalfields of Appalachia, but I could find none. From 1917 to 1927, however, a cooperative mining town called Himlerville did exist in Martin County, Kentucky, across the Tug Fork River from the notorious company towns of “Bloody” Mingo County, West Virginia.[2] The Himler Coal Company was founded by a Hungarian coal miner named Martin Himler on the premise that the workers would be the stockholders and the stockholders would be the workers. Each year the company’s profits were distributed as dividends, and every miner, no matter his position, shared equally in stock bonuses.

[2] I also discovered an outfit far beyond the southern mountains, in Wyoming, called the Kiewit Mining Group, whose website touts its “broad-based employee ownership.” I checked out the safety record of its Buckskin Mine in Campbell County. According to the U.S. Mine Safety and Health Administration, Kiewit extracts on average about 15 million tons of coal there each year, and in sixteen years the mine has recorded only thirty-nine injuries and zero deaths.

At first Himlerville flourished, with handsome cottages, gardens, and indoor plumbing. It had a library, an auditorium, a school, and a bake shop. Residents didn’t suffer from typhoid, as they did across the river in Mingo County, nor did gun thugs patrol the grounds to intimidate miners and thwart attempts at collective bargaining. By 1922, the Himler Coal Company had raised its capital base from $500,000 to $2 million, and so decided to open two new mines. But then coal prices plunged, and the railroads brought in competition from larger corporations. “In 1927, the company was sold at auction to private capitalists,” writes historian Ronald Eller, “and the only effort at cooperative mining in the southern mountains came to an end.” Today, only Martin Himler’s dilapidated Victorian home, in what’s now called Beauty, Kentucky, stands as evidence of the experiment.

One can say today about an Appalachian deep mine what Sarah Ogan Gunning sang almost eighty years ago in her anthem “Come All You Coal Miners”:

Coal mining is the most dangerous work in our land today.
Plenty of dirty slaving work and very little pay.

Trapper Boy, Turkey Knob Mine, MacDonald, West Virginia, October 1908. Courtesy the Library of Congress

The pay has gotten better, thanks to unions, but the work remains deadly. And Gunning, who had watched children starve to death in coal camps, meant it when she sang the last line: “Let’s sink this capitalist system to the darkest pits of hell.” Of course in today’s America, this sentiment represents the worst kind of heresy: a denial of capitalism’s benevolent hand and the free market’s capacious ability to best know our human needs. It’s socialism, wealth-spreading, devil-worship.

But what if it isn’t? Pause for a moment to consider how this country might look if we did shift wealth away from predatory lenders and speculators and toward the workers who produce real wealth in the form of goods and services. What if this shift represented a radical and ethical form of democracy — one grounded in trust, decent work, and marketplace morality?

The financial crisis of 2008 had a long gestation period that can be traced back to 1783, when Alexander Hamilton persuaded Continental Army soldiers, desperate for cash, to sell their war bonds to his speculating friends at one-thirtieth of their value. In the earliest days of the republic, Hamilton and financier–politician Robert Morris were making shady deals to funnel American wealth to the banking class of New York. Hamilton wanted to centralize the country’s wealth and power as fervently as his nemesis Thomas Jefferson wanted a decentralized nation of agrarian, self-sufficient wards. But of course we adopted Hamilton’s vision, not Jefferson’s, and as a result the United States now has the largest income gap of any country in the northern hemisphere — one that is now wider than at any point in our country’s history.

In their 2009 book, The Spirit Level, epidemiologists Richard Wilkinson and Kate Pickett concluded that nearly every societal problem can be tied directly to income inequality. The United States has higher levels of mental illness, infant mortality, obesity, violence, incarceration, and substance abuse than almost all other “developed” countries. And we have the worst environmental record in the world. When they died, the twenty-nine West Virginia miners were digging coal that the rest of us consume twice as fast as Americans did in the 1970s. Yet still we leave unquestioned the overarching goal of infinite economic growth on a planet of finite resources. The American economist Kenneth Boulding once remarked, “Anyone who believes that exponential growth can go on forever is either a madman or an economist.” But as we listen daily to the president, to members of Congress, and to the financial analysts who sail by on cable news, the dominant message is that endless economic growth is this country’s singular destiny.

In his biography of Hamilton, Ron Chernow wrote, “Today, we are indisputably the heirs to Hamilton’s America, and to repudiate his legacy is, in many ways, to repudiate the modern world.” Exactly. We are indeed Hamilton’s heirs, and to repudiate his legacy will mean repudiating what modern capitalism has brought us: toxic loans, toxic securities, toxic energy sources, and toxic growth.

But what if we replaced our Hamiltonian economy with a Jeffersonian one? Or, put in other terms, what if we took as our model not an economy of unchecked growth, but one based on the natural laws of the watershed? By its very nature, a watershed is self-sufficient, symbiotic, conservative, decentralized, and diverse. It circulates its own wealth over and over. It generates no waste, and doesn’t “externalize” the cost of “production” onto other watersheds. In a watershed, all energy is renewable and all resource use is sustainable. The watershed purifies air and water, holds soil in place, enriches humus, and sequesters carbon. It represents both a metaphor and a model for an entirely new definition of economy, whereby our American system of exchange in the realms of wealth and energy is brought into line with the most important and inescapable economy of nature.

Erik Reece is the author of, most recently, The Embattled Wilderness, and the editor of The Guy Davenport Reader. He lives in Nonesuch, Kentucky.

