January 2014 Issue [The Anti-Economist]

The Digital Revolution That Wasn’t

For as long as there have been machines, there have been worries about their power to destroy jobs. The Luddites — early-nineteenth-century artisans who bitterly resisted the new textile machinery that was making them obsolete — are only the most famous example of workers who fought the mechanization of labor. A similar fight has played out many times since the beginning of the Industrial Revolution, over water mills and steam engines, electricity and the assembly line.

But no one today would wish away these technologies. That is because, over time, economies kept growing, jobs were created, wages rose, and prosperity was shared throughout the population. In the wake of the Industrial Revolution, human rights were increasingly — if never adequately — protected, democracy became more firmly rooted, and new inventions, from the automobile to the television, made daily life more pleasant for the vast majority of American workers. There is a simple economic explanation for why technologies that disrupt the labor force in one area tend eventually to benefit everyone: productivity. Increases in the amount of goods made or services delivered per hour of work generally lead to greater prosperity. Some jobs are eliminated, but more and better-paying jobs often replace them.
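
In the simplest terms (this shorthand is mine, added for clarity, not anything from the column's sources), labor productivity is just output per hour worked:

$$ P = \frac{Y}{H} $$

where $Y$ is total output and $H$ is total hours. When $P$ rises, the same hours of work yield more goods and services; the question the rest of this column pursues is why measured $P$ stopped rising quickly, and who captures the gains when it does.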

It should be noted that this trend requires government support through wage regulations, the protection of labor unions, and a social safety net of unemployment insurance, employment tax credits, and retirement and health-care programs for the elderly. But on balance, the Industrial Revolution was a natural experiment that proved one of Adam Smith’s central points: Rising productivity is the source of the wealth of nations. The founding of the United States coincided almost exactly with the start of the Industrial Revolution, so we are a nation that has always experienced this rising productivity. While we reached a true golden age in the 1950s and 1960s, when both productivity and wages rose unusually rapidly and unemployment remained low, increasing prosperity has been the norm for most of American history.

In the 1970s, however, productivity growth slowed down suddenly and significantly — and it did so just as the use of the computer began to sweep through business. Surely this invention, which so many have compared to the greatest breakthroughs of the eighteenth and nineteenth centuries, should have brought with it a surge in productivity. But the surge never came. After years of disappointment — in the words of the economist Robert Solow, you could “see the computer age everywhere but in the productivity statistics” — the development of the Internet became the next great electronic hope. If individual computers had failed to deliver the expected gains, certainly networking them would.

The Internet caused the same short-term job disruption that all major technologies do. Amazon destroyed bookstores and other brick-and-mortar retailers, newspapers began to fold, and the offshoring of various routine jobs became much easier. But free-market advocates assured us that in the long term the lost jobs would be made up elsewhere, as they had in countless similar situations since the 1770s. In 1997, The Economist claimed that the “communications revolution” would produce “a change even more far-reaching than the harnessing of electrical power a century ago.” Businessweek, one of the most reliable outlets for New Economy boosterism, called the computer “a transcendent technology — like railroads in the 19th century and automobiles in the 20th. . . . Forget 2% real growth. We’re talking about 3%, or even 4%.”

For a little while, it looked possible. Productivity rose at about 3 percent annually between 1996 and 2005. In the late 1990s, hardware and software companies such as Intel and Microsoft helped add millions of jobs to the economy. Like the Ford Motor Company before them, these companies paid good wages. Even Walmart and other big-box retailers responded to productivity increases in the late 1990s by adding jobs. By 1999, the unemployment rate had fallen to just over 4 percent, its lowest point since the Sixties. But the boom subsided far sooner than New Economy advocates anticipated. Productivity growth had begun to fall by 2005, well before the onset of the Great Recession. Since then, it has averaged a mere 1.5 percent annually — about the same level as in the disappointing period from the early Seventies to the mid-Nineties — and it grew at closer to half a percent in 2011 and 2012. Productivity has grown at the golden-age average in only eleven of the past forty years.
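
To see what those percentages mean over a working lifetime, a bit of back-of-the-envelope arithmetic helps (the calculation is mine, not drawn from any of the reports cited here). The standard doubling-time formula,

$$ t_{\text{double}} = \frac{\ln 2}{\ln(1 + g)}, $$

says that at the golden-age rate of roughly 3 percent ($g = 0.03$), output per hour doubles in about 23 years; at 1.5 percent, it takes about 47 years; at half a percent, nearly 139 years. The gap between the New Economy promise and what we got is not a rounding error but a difference of generations.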

If we examine the data more closely, we can see how little new technologies have helped the United States return to the rapid growth that made it so rich. “Computers and related electronics, the rest of manufacturing, and information sectors have contributed around half of overall productivity growth since the turn of the century,” notes a recent report from the consulting firm McKinsey & Company, “but reduced employment by 4.5 million jobs.” Walmart and copycat retailers also started shedding jobs while continuing to offer workers miserable wages and benefits. Outsourcing enabled manufacturers to reduce the domestic workforce and put pressure on the wages of those who remained. The result was that the economic expansion under George W. Bush, from 2001 to 2007, saw the weakest employment growth of any expansion since World War II.

The newly dominant information companies are not job producers. In 1955, General Motors employed nearly 600,000 people. Today, in a much larger economy, Google employs fewer than 50,000; eBay employs about 20,000 people in the United States; Facebook, fewer than 6,000. The numbers for Apple, Microsoft, and Amazon — which each employ between 80,000 and 100,000 worldwide — are better, but still small potatoes compared with General Electric or Ford. By one calculation, the development of the iPod created fewer than 14,000 U.S. jobs. Meanwhile, Americans work harder, equipped with devices that allow them to check in while at home, commuting, or on vacation. (We may actually be undercounting hours worked, which would mean productivity is even lower than currently estimated.)

Given the long-standing precedents, it is tempting to explain away the failure of the information age to provide more prosperity. Some argue that our sluggish recovery from the current economic crisis is to blame, but as I’ve already mentioned, the slowdown in productivity growth preceded the recession. Others say that the true output of the new technologies just doesn’t show up in GDP, since the daily utility of a cell phone or the pleasure of playing a video game is not easily counted — but neither were many of the pleasures derived from cars, transistor radios, or televisions, and measured productivity kept rising after the introduction of these devices.

Still others point to one area of the economy where innovation did bring greater productivity of a sort: the financial sector. As a proportion of the economy, finance has roughly doubled in size since 1980, and its measured productivity is high; indeed, the GDP contributed by finance served to inflate productivity figures in the early 2000s. But financial products were overpriced and oversold, contributing little to the nation’s welfare and a lot to bankers’ pocketbooks. The risky derivatives bankers sold were really time bombs, and the credit-default swaps designed as insurance against a downturn were never paid off. Given how much damage they’ve caused, a large portion of Wall Street sales might be best characterized as negative GDP. Cheap, often fraudulent mortgages, more than true innovation in financial products, drove the economy of the 2000s. Now, finance may be coming back down to earth.

If we set these excuses aside, how do we explain the apparent historical anomaly of technological progress without productivity gains? An increasing number of economists, Northwestern University’s Robert Gordon among them, argue that this most recent wave of technological innovation is simply not comparable in terms of productivity potential to those that preceded it. “The rapid progress made over the past 250 years,” Gordon writes, “could well turn out to be a unique episode in human history.” He concludes that the so-called third industrial revolution is unlikely to match the gains in productivity brought about first by steam power and railroads and then by electricity and the internal combustion engine.

Of course, no one can plausibly forecast the course of innovation. Famed economists such as Harvard’s Alvin Hansen (1887–1975) claimed as far back as the Great Depression that technology had run its course. I’d guess the odds are high that major innovations are yet to come, even if research investment declines, simply because there are so many trained engineers, scientists, and entrepreneurial business leaders, and so many ideas to build on. Microchips could get radically cheaper still. The potential of nanotechnology may pay off at last. Breakthroughs in various fields could relieve the economic costs of reducing pollution.

In the meantime, the government could be doing much to make us a more productive society. Health care is one of the nation’s biggest industries, and it is delivered less efficiently here than almost anywhere else in the world. Even if it meets its stated aims, Obamacare should be only the beginning of reform. We need greater efforts to deliver health care to all — not just for the sake of fairness but also for that of our economy: jobs in the industry are difficult to move offshore, and a population that stays healthier longer is a major boon to productivity.

Education is another field in which the government can make a difference — and one in which jobs tend to stay at home. Better pay for teachers and administrators could go a long way toward improving elementary and high school education. Making higher education more affordable would free younger workers from the burden of debt.

Personal computers, cell phones, and the Internet have so radically changed our day-to-day lives that it’s natural to frame them as developments on par with the steam engine or indoor plumbing. But we’re four decades into the digital age, and the economic benefits have been paltry at best. Robert Gordon and others think our future rests mostly with narrowly defined technological advances from the private sector. He may be right, or another culture-changing innovation may be right around the corner. Either way, America’s productivity slowdown is cause for alarm, and there’s much we could be doing about it besides waiting around for the next big thing.
