
Last month, Oliver Stone’s Untold History of the United States, the documentary series the director made with historian Peter Kuznick, completed its ten-episode run on Showtime. Those who unthinkingly dismiss Stone as a conspiracy theorist might be surprised by this scrupulously accurate warts-and-all consideration of the American century, which complicates our self-serving national folklore. More than thirty years after its publication, Howard Zinn’s A People’s History of the United States has attained the status of a classic, but we could use more efforts in this direction. “There is no such thing as impartial history,” Zinn acknowledged, but revisions like his and Stone’s provide a necessary corrective to the ignorance and whitewashing that often proceed from national embarrassments.

Unfortunately, we almost entirely lack similarly comprehensive correctives to prevailing economic history. It has been a century since the publication of Charles Beard’s famous An Economic Interpretation of the Constitution of the United States, which argued that the Founding Fathers designed the Constitution partly to serve their own material interests. Since then, romantic and ideological myths about how America became prosperous have too often gone unchallenged. This is more than an academic concern: these misapprehensions continue to drive poor policymaking today.

What myths would a good alternative economic history debunk? The first might be that America owed its rapid economic growth in the nineteenth century to the small size of its federal government. This widely accepted narrative neglects the many regulatory and legal reforms that went into effect in those years, reforms guaranteeing fair competition in business and the right of ordinary people to buy land. The myth also ignores state-financed investments in canals and railroads; the development of free primary (and, later, high school) education, paid for with taxes; and the building of sewers and water-sanitation systems in cities, which controlled disease. It overlooks high tariffs, imposed by the federal government, that spurred the development of manufacturing. This is not to mention the economic benefits of slave labor (hardly a laissez-faire arrangement), which required government enforcement. For good or for ill, intervention by the state into economic life was a reality of the time.

There have been similarly damaging misinterpretations and distortions of our economic boom in the years following World War II. What caused economic growth in America during this time, when our federal government was undeniably large and active?

Tyler Cowen, a respected conservative economics professor who writes frequent columns for the New York Times, attempts to answer this question in The Great Stagnation, a small book — really a pamphlet — that made the Times bestseller list in 2011. It’s a silly, trivial book, but a representative one. According to Cowen, America’s postwar prosperity can be reduced to a few simple principles: plentiful natural resources in the early years, technological advances, and an increasingly educated population.

Is this adequate history? Howard Zinn wrote that the “chief problem in historical honesty isn’t outright lying” but the “omission or de-emphasis of important data.” In the case of America’s two and a half postwar decades, Cowen is right to note the importance of natural resources — but a balanced economic history would also note that the price of petroleum, the most important natural resource at that time, was kept artificially low. Oil consumption tripled in the 1950s and 1960s, yet prices fell because of diplomatic machinations and threats of force that maintained U.S. access to growing reserves in the Middle East. Government made the difference.

Cowen is also quite right to note the key role technological advances played in our postwar boom, but these advances, too, were often driven by government stimulus. Cowen scoffs at the “nostalgia” of such economists as Paul Krugman who advocate Keynesian spending as a way to return to full employment, but Keynesian policies drove growth in the postwar era. The Cold War and the growing political power of defense companies — Eisenhower’s “military–industrial complex” — kept defense spending at 12 percent of gross domestic product in the 1950s and 10 percent in the 1960s. Those decades also saw the implementation of enormous government-sponsored research and development projects, the building of the interstate highways, and the continuation of the G.I. Bill and other forms of tuition subsidies, all of which contributed to growth. Today, even during a major war, military spending is around 5 percent of GDP. I point this out not to argue for more military spending but to suggest how much we could afford to be investing in education, research and development, communications, and transportation infrastructure.

Cowen barely acknowledges government’s role in education, but he does at least recommend more federal subsidies of research. Central to his claims about the futility of most other government policies to enhance growth today is the assertion that such advances as the combustion engine, the commercial jet, the nation’s highways, and the Internet have been fully exploited. The “low-hanging fruit,” as Cowen terms it, has all been picked.

But none of these advances looked like easy picking before they were developed, and Cowen has no empirical reason to believe that the march of technology has ended. He is simply making a stab in the dark to justify his preferred policy of minimal government action. To this end, Cowen either doesn’t know or chooses not to tell us that the “low-hanging fruit” hypothesis was also used in the 1930s to explain the Great Depression. Alvin Hansen, a prominent Harvard economist, was the leading advocate in that era of basically the same stagnationist thesis Cowen presents today. But as we know, and as the post–World War II boom showed, he was wrong. War spending provided the Keynesian stimulus that turned the Depression’s stagnation into rapid growth. Once the economy recovered, there proved to be plenty more low-hanging fruit to be plucked.

Another economic myth is that the debilitating inflation of the 1970s was caused by loose monetary policy — that is, by the Federal Reserve’s creating too much money through the imposition of very low interest rates. Annualized inflation reached as high as 11.2 percent in that decade. High inflation in turn pushed up the rates that lenders demanded to compensate for the reduced value of money in the future. Interest rates on mortgages, for example, peaked at around 18 percent.
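The mechanism described here, in which lenders raise nominal rates to offset the expected erosion of money's value, is the Fisher equation. A minimal sketch (the real-return figure below is illustrative, not from the article):

```python
# Fisher equation: (1 + nominal) = (1 + real) * (1 + expected inflation)
def nominal_rate(real_rate, expected_inflation):
    """Nominal rate a lender would demand to preserve a given real return."""
    return (1 + real_rate) * (1 + expected_inflation) - 1

# A lender seeking a 3% real return while expecting 11.2% annual inflation
# (the decade's peak rate cited above) would demand roughly 14.5% nominal.
print(round(nominal_rate(0.03, 0.112), 4))  # 0.1454
```

This illustrates why peak-1970s inflation dragged mortgage and other borrowing rates into double digits regardless of what caused the inflation itself.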

Milton Friedman was foremost among those who blamed inflation on the loose monetary policy pursued by Jimmy Carter’s Fed. Friedman’s view has since become conventional history. But the belief is mostly wrong. More likely culprits — as Princeton economist and former Federal Reserve vice chairman Alan Blinder persuasively argued as early as 1982 — were two years’ worth of major crop failures, which led to soaring food prices, and the Arab nations’ oil-price hike, which quadrupled prices, from roughly $3 a barrel to $12 by 1974.

This contrarian take was angrily challenged, but in 2008 Blinder, working with Federal Reserve economist Jeremy Rudd, ran the data again, accounting for the revisions made in the intervening years. They found even stronger support for the original claim that food and oil “supply shocks,” not Fed policy, had accounted for the majority of inflation. Why does this argument matter? Because if Fed policy is the cause of high inflation, then the solution is to tighten the money supply and raise interest rates — even if that means a punishing recession. In other words, throw people out of work and cut spending on goods and services. And this is just what the Fed subsequently did under Paul Volcker. The truth, I conclude from Blinder’s work, is that inflation would have gradually subsided even without harsh monetary policy — that is, without a recession.

The Friedman mythology is also the source of another related historical fantasy: the heroism of Paul Volcker. Late in his presidency, Carter chose Volcker to replace G. William Miller as Fed chairman. Volcker soon reversed Miller’s dovish monetary policy, eventually raising the Fed target rate — the point at which the central bank seeks to set short-term borrowing rates between banks — to a high of 20 percent in 1981, 10 percent above where it stood when he took charge in 1979. Other borrowing rates rose accordingly. The economy went into free fall, and unemployment soared to nearly 11 percent in 1982. It was the worst recession since the Great Depression. Not only do economists give Volcker a pass on the severe recession, but he is among the most praised policymakers of our time.

For the subsequent twenty-five years, keeping inflation low — to about 3 percent a year — became virtually the only economic goal of Washington policymakers. Yet there is no empirical evidence to suggest that higher inflation rates would reduce economic growth. One fine report, written by several economists, including the Nobel laureate George Akerlof, showed that a higher inflation target throughout this period would have been entirely acceptable, even desirable.

Perversely, this obsession with inflation made keeping the unemployment rate high the real aim of monetary policy. If one tries to lower unemployment below its “natural rate,” said Friedman and his followers, inflation will follow. It turned out to be one of the great hoaxes of the time. In the 1980s, even liberal economists claimed the so-called natural rate was as high as 6 percent. But by 2000, the unemployment rate fell below 4 percent, with no sign of rising inflation. Economists didn’t backtrack; instead they decided that the once inviolable natural rate simply changed all the time, and they set about figuring out how to predict these variations. They haven’t succeeded yet.

One last major historical myth in need of debunking: Ronald Reagan, an economic hero to so many, did not revitalize the economy through sharp tax cuts and deregulation. To the contrary, his policies badly failed. Under Reagan, income inequality began to rise rapidly after three decades of solid increases in earnings across all income levels. The top 10 percent of earners received 92 percent of all new income during this period, compared with 35 percent of the rise in income from 1960 to 1969.

Wages grew far faster after Clinton’s tax hike than after the implementation of Reaganomics. And as for revitalizing productivity, output per hour of work — the key measure of the economy’s ability to produce income — rose far more slowly under Reagan than it did before or after.

The assumption that technology, natural resources, and a more educated workforce are the only important sources of growth leads us to believe that Keynesian policies are not valuable. The assumption that inflation has been caused by the Federal Reserve’s effort to keep unemployment down leads to policies that keep interest rates high, unemployment up, and wages low. Romanticizing the achievements of Ronald Reagan leads many to believe that tax cuts inevitably generate rapid economic growth.

Most damaging, however, are views of history, like Cowen’s and many other economists’, that fail to recognize the central place of government policy and intervention in economic growth. One tragic result of such views was the extreme financial deregulation and weak implementation of existing regulation that led directly to the financial crisis and the Great Recession. Another result is the weakness of current policies aimed at supporting the recovery — in particular, the failure to adopt a second fiscal stimulus back in 2009, and current austerity policies that will further diminish growth and perhaps even lead to a recession this year.



February 2014
