March 2015 Issue [Readings]

In Regulation Nation

By David Graeber, from The Utopia of Rules, published last month by Melville House. Graeber is the author of Debt: The First 5,000 Years. His most recent article for Harper’s Magazine, “Army of Altruists,” appeared in the January 2007 issue.

Nobody thinks much about bureaucracy anymore. But in the middle of the twentieth century, particularly in the late Sixties and early Seventies, the word was everywhere. It dominated works of sociology with grandiose titles like A General Theory of Bureaucracy, popular paperbacks like The Peter Principle: Why Things Always Go Wrong, and novels and films such as Joseph Heller’s Something Happened and Jacques Tati’s Playtime. Everyone seemed to feel that the foibles and absurdities of bureaucratic life were among the defining features of modern existence and, as such, worth discussing.

Today, the subject rarely comes up; perhaps we’ve simply become habituated. When we do discuss bureaucracy, we still use terms established in the Sixties and Seventies. The social movements of the Sixties were, on the whole, left-wing in inspiration, but they were also rebellions against the bureaucratic mind-set, the gray functionalism of both state-capitalist and state-socialist regimes, the soul-destroying conformity of the postwar welfare states. In the face of social control, Sixties rebels stood for individual expression and spontaneous conviviality.

With the collapse of the old welfare states, this kind of rebellion has come to seem decidedly quaint. As the right has adopted the language of anti-bureaucratic individualism, insisting on “market solutions” to every social problem, the mainstream left has limited itself to salvaging remnants of the old welfare state. It has acquiesced to — and often spearheaded — traditionally right-wing attempts to make government efforts more “efficient,” whether through the privatization of services or the incorporation of “market principles,” “market incentives,” and market-based “accountability processes.” The result has been political catastrophe.

At least the right has a critique of bureaucracy. Its origins may be found in the nineteenth century, when liberal thinkers argued that Western civilization was undergoing a gradual, uneven, but inevitable transformation from the rule of warrior-elites to a society of liberty, equality, and enlightened commercial self-interest. In the wake of the French Revolution, absolutist states were giving way to markets, religious faith to scientific understanding, and fixed orders and noble ranks to free contracts between individuals.

The right-wing argument goes one step further. Ludwig von Mises, the exiled Austrian aristocrat and economic theorist who was its greatest twentieth-century exponent, argued in his 1944 book, Bureaucracy, that systems of government administration could never organize information with anything like the efficiency of impersonal market-pricing mechanisms, and that the administrators of social programs would end up destroying the political basis of democracy by forming powerful blocs against elected officials. Even well-meaning bureaucrats would do more harm than good.

The idea that the market is somehow opposed to and independent of government has been used at least since the nineteenth century to justify laissez-faire economic policies, but such policies never actually have the effect of lessening the role of government. In late-nineteenth-century England, for instance, an increasingly liberal society did not lead to a reduction of state bureaucracy but the opposite: an endlessly mushrooming array of legal clerks, registrars, inspectors, notaries, and police officials — the very people who made possible the liberal dream of a world of free contract between autonomous individuals. It turned out that maintaining a free-market economy required considerably more paperwork than a Louis XIV–style absolutist monarchy. The same effect could be seen in America during Ronald Reagan’s presidency, or in Russia after the fall of the Soviet Union, where, from 1994 to 2002, the number of civil servants jumped by some quarter million.

Indeed, this paradox can be observed so regularly that I think we are justified in treating it as a general sociological principle. Let’s call it the Iron Law of Liberalism: Any market reform or government initiative intended to reduce red tape and promote market forces will ultimately increase the total number of regulations, the total amount of paperwork, and the total number of bureaucrats the government employs. Émile Durkheim was already observing this tendency at the turn of the twentieth century, and fifty years later even right-wing critics like F. A. Hayek were willing to admit that markets don’t really regulate themselves: they require an army of administrators to keep them going.

Still, conservative populists recognized that making a target of bureaucrats was almost always effective, whatever the reality. Hence the 1968 presidential campaign of Alabama governor George Wallace, in which he continually maligned “pointy-headed bureaucrats” living off the taxes of hardworking citizens. He was one of the first politicians to create a national platform for this argument, which was adopted a generation later across the political spectrum. Working-class Americans now generally believe government to comprise two sorts of people: “politicians,” who are blustering crooks and liars but can at least occasionally be voted out of office, and “bureaucrats,” who are condescending elitists and almost impossible to uproot. The right-wing argument tends to assume a kind of tacit alliance between a parasitic poor (in America usually pictured in overtly racist terms) and equally parasitic self-righteous officials who subsidize the poor using other people’s money. Even the mainstream left now offers little more than a watered-down version of this language. Bill Clinton, for instance, spent so much of his career bashing civil servants that after the Oklahoma City bombing in 1995, he felt he had to remind Americans that public servants were human beings, too.

In America — and increasingly in the rest of the world — the only alternative to “bureaucracy” is now “the market.” Sometimes this is taken to mean that the government should be run more like a business, other times that we should get bureaucrats out of the way and let the magic of the marketplace provide its own solutions. “Democracy” has become a synonym for “the market,” just as “bureaucracy” has become one for “government interference.”

It wasn’t always so. For much of the nineteenth century, the economy of the United States, like that of Britain, was largely built on small family firms and high finance. But America’s advent as a world power at the end of the century coincided with the rise of a distinctly American form: corporate — that is to say, bureaucratic — capitalism. The commonly held view in the country at the time was that the modern corporation had emerged when public bureaucratic techniques were applied to the private sector — techniques that were thought to be necessities when operating on a large scale, since they were more efficient than the networks of personal or informal connections through which family firms operated. Max Weber observed in the early twentieth century that Americans were particularly inclined to conflate public and private bureaucracies. They did not complain that government should be run more like a business; they simply assumed that governments and big businesses were already being run the same way.

Americans often seem embarrassed by the fact that, on the whole, we’re really quite good at bureaucracy. It doesn’t fit our American self-image. We’re supposed to be self-reliant individualists. But it’s impossible to deny that for well over a century the United States has been a profoundly bureaucratic society. When taking over the reins of the global economic and political order from Great Britain after World War II, for instance, the United States was particularly concerned with creating structures of international administration. Right away it set up the world’s first thoroughly planetary bureaucratic institutions: the United Nations and the Bretton Woods institutions (the International Monetary Fund, World Bank, and General Agreement on Tariffs and Trade). The British Empire had never attempted anything like this. Even when Britain created large corporations like the East India Company, its goal was either to facilitate trade with other nations or to conquer them. The Americans attempted to administer everything and everyone.

If Americans are able to overlook their awkward preeminence in this field, it is probably because most of our bureaucratic habits and sensibilities — the clothing, the language, the design of forms and offices — emerged from the private sector. When novelists and sociologists described the “Organization Man,” or the “Man in the Gray Flannel Suit” — the soulless, conformist U.S. counterpart to the Soviet apparatchik — they were not talking about functionaries in the Census Bureau; they were picturing corporate middle management.

Bureaucratic structures and techniques first began to intervene conspicuously in ordinary people’s lives in the Thirties, through programs like Social Security and the Works Progress Administration. The idea that “bureaucrat” could be assumed to be a synonym for “civil servant” can be traced back to this time. But from the very beginning, Roosevelt’s New Dealers worked in close coordination with, and absorbed much of their style and sensibilities from, the battalions of lawyers, engineers, and corporate bureaucrats employed by firms like Ford, Coca-Cola, and Procter & Gamble. As the United States shifted to a war footing in the Forties, so did the gargantuan bureaucracy of the U.S. military. The need to preserve or develop certain domestic industries for military purposes created an alliance between military and corporate bureaucrats that has allowed the U.S. government to engage in Soviet-style industrial planning without having to admit that it’s doing so.

Since the Seventies, when the financial sector began to dominate the U.S. economy, it has become even harder to distinguish between public and private. This owes much to the way private corporations have come to operate. Consider the maze of rules one must navigate if something goes even slightly awry with a bank account. The regulations behind such rules were almost certainly composed by aides to legislators on some congressional banking committee along with lobbyists and attorneys employed by the banks themselves, in a process greased by generous contributions from the banks to those very legislators. The same story is behind credit ratings, insurance premiums, mortgage applications, and even the process of buying an airline ticket. The vast majority of the paperwork we do exists in just this sort of in-between zone: though ostensibly private, it adheres to a legal framework and mode of enforcement that is shaped entirely by a government that works closely with private concerns to ensure that the results will guarantee a certain rate of private profit.

The language we use to talk about this state of affairs — derived as it is from the right-wing critique of bureaucracy — tells us nothing about what is actually going on. Consider the word “deregulation.” In today’s political discourse, deregulation — like “reform” — is almost invariably treated as a good thing. Deregulation means less bureaucratic meddling, and fewer rules and regulations to stifle innovation and commerce. This ideologically inflected usage puts those on the left in an awkward position, since opposing deregulation seems to imply a desire for more rules and regulations, and therefore more men in gray suits standing in the way of freedom.

But this debate is based on false premises. There’s no such thing, for example, as an unregulated bank. Banks are institutions to which the government has granted the right to issue I.O.U.’s that it will recognize as legal tender. The government regulates everything from a bank’s reserve requirements to its hours of operation; how much can be charged in interest, fees, and penalties; what sort of security precautions can or must be employed; how its records must be kept and reported; how and when clients must be informed of their rights and responsibilities; and pretty much everything else.

So what are people referring to when they talk about deregulation? In ordinary usage, the word seems to mean “changing the regulatory structure in a way that I like.” In the case of banking, deregulation has usually meant moving away from a situation of managed competition between midsize firms to one in which a handful of financial conglomerates are allowed to completely dominate the market. In the case of airlines and telecommunication firms in the Seventies and Eighties, deregulation meant the opposite: changing the system of regulation from one that encouraged a few large firms to one that fostered carefully supervised competition between midsize firms. In neither of these cases was bureaucracy reduced.

I’m going to give this process — the gradual fusion of public and private power, which becomes rife with rules and regulations whose ultimate purpose is to extract wealth in the form of profits — a name: “total bureaucratization.” The process is a result of the growing power of financial institutions over a deeply bureaucratized postwar America, and it defines the age we live in. It came about thanks to a shift in the class allegiances of the managerial staff of major corporations: from an uneasy, de facto alliance with workers to one with investors. As John Kenneth Galbraith long ago pointed out, if you create an organization to produce perfumes, milk, or aircraft fuselages, those who belong to that organization will tend to concentrate on improving the process and its product, rather than thinking primarily of what will make the most money for shareholders. Since for most of the twentieth century a job in a large bureaucratic firm meant a lifetime promise of employment, everyone involved in the process tended to share a certain common interest.

What began to happen in the Seventies, which paved the way for what we see today, was a strategic turn, as the upper echelons of U.S. corporate bureaucracy moved away from workers and toward shareholders. There was a double movement: corporate management became more financialized and the financial sector became more corporatized, with investment banks and hedge funds largely replacing individual investors. As a result, the investor class and the executive class became almost indistinguishable. By the Nineties, lifetime employment, even for white-collar workers, had become a thing of the past. When corporations needed loyalty, they increasingly secured it by paying their employees in stock options.

At the same time, everyone was encouraged to look at the world through the eyes of an investor — which is one reason why, in the Eighties, newspapers continued laying off their labor reporters, while ordinary TV news reports began featuring stock-quote crawls at the bottom of the screen. By participating in personal-retirement and investment funds, the argument went, everyone would come to own a piece of capitalism. In reality, the magic circle only widened to include higher-paid professionals and corporate bureaucrats. Still, the perceived extension was extremely important. No political revolution (for that’s what this was) can succeed without allies, and bringing along the middle class — and, crucially, convincing them that they had a stake in finance-driven capitalism — was critical.

With this shift came a broader cultural transformation. Bureaucratic techniques developed in financial and corporate circles (performance reviews, focus groups, time-allocation surveys, and so on) spread throughout the rest of society, to education, science, and government. One can trace the process by following its language. There is a peculiar idiom that first emerged in corporate circles, full of bright, empty terms like “vision,” “quality,” “stakeholder,” “leadership,” “excellence,” “innovation,” “strategic goals,” and “best practices.” Much of it originated from “self-actualization” movements like Mind Dynamics, Lifespring, and est, which were extremely popular in corporate boardrooms in the Seventies. But it quickly became a language unto itself, engulfing any meeting where any number of people gather to discuss the allocation of any kind of resources.

For all that it celebrates markets and individual initiative, this alliance of government and finance often produces results that bear a striking resemblance to the worst excesses of bureaucratization in the Soviet Union, or in former colonial backwaters of Africa and South America where official credentials like certificates, licenses, and diplomas are often regarded as a kind of material fetish — magical objects conveying power in their own right. Since the Eighties, the most “advanced” economies, including the United States, have seen an explosion of credentialism. As Sarah Kendzior, an anthropologist, puts it, “Credentials function as de facto permission to speak, rendering those who lack them less likely to be employed and less able to afford to stay in their field.”

This plays out in field after field, from journalism to nursing, physical therapy to foreign-policy consulting. Endeavors that used to be considered an art, best learned through doing, like writing or painting, now require formal professional training and certification. While these measures are touted — like all bureaucratic measures — as a way of creating fair, impersonal mechanisms in fields previously dominated by insider knowledge and social connections, they often have the opposite effect. As anyone who has been to graduate school knows, it’s precisely the children of the professional-managerial classes, those whose family resources make them the least in need of financial support, who best know how to navigate the paperwork required to get this support. For everyone else, the main result of years of professional training is an enormous burden of student debt, which requires its holders to bureaucratize ever-increasing dimensions of their own lives, and to manage themselves as if they were each a tiny corporation.

Sociologists since Weber have often noted that one of the defining features of a bureaucracy is that its employees are selected by formal criteria — most often some kind of written test — but everyone knows how compromised the idea of bureaucracy as a meritocratic system is. The first criterion of loyalty to any organization is therefore complicity. Career advancement is not based on merit but on a willingness to play along with the fiction that career advancement is based on merit, or with the fiction that rules and regulations apply to everyone equally, when in fact they are often deployed as an instrument of arbitrary personal power.

This is how bureaucracies have always tended to work, though for most of history, this fact has only been important for those who actually operated within administrative systems. Other people encountered organizations only rarely, when it came time to register their fields and cattle with the tax authorities. But the explosion of bureaucracy over the past two centuries — and its acceleration over the past forty years — means that our involvement in this bureaucratic existence has intensified. As whole societies have come to represent themselves as giant credentialized meritocracies, rather than as systems of predatory extraction, we bustle about, trying to curry favor by pretending we actually believe it to be true.

