When Bill de Blasio won the New York City mayoral election with 73 percent of the vote, I couldn’t help but wonder whether it was the start of something big. For more than two decades it has been taken as an article of faith that New York City voters prefer social liberals who will nonetheless protect the interests of wealthy Manhattanites. But throughout the campaign, de Blasio focused relentlessly on the city’s problems with economic inequality. He proposed raising taxes on high earners to pay for universal pre-K education. He promised to force luxury real estate developers to build affordable housing. He called for an immediate end to the NYPD’s “stop-and-frisk” policy. And he won in a landslide, with the largest margin of victory of any non-incumbent in the city’s history.*
Voters in New York rejected the supreme pragmatism of Mayor Michael Bloomberg — and his insensitivity to how most people live. Manhattan, much of Brooklyn, and, increasingly, sections of Queens are now places for the relatively rich, and to Bloomberg that is good news. On his way out of office, he made the remarkable statement that the more Russian billionaires moved to New York the better, because they provide the tax base necessary for city services. But property taxes, not income taxes, are the main revenue generator in New York. Sales-tax revenue is also critical. Far better to have more jobs — allowing more people to buy homes and spend on goods and services — than to attract a handful of new billionaires, who spend proportionally less of their income than the middle class does, and who are less likely to spend it locally.
In the Democratic mayoral primary, voters were presented with a continuation of Bloomberg’s billionaire-friendly policies in the form of city-council speaker and Bloomberg heir apparent Christine Quinn, whose nomination was widely seen as inevitable. Instead they chose de Blasio’s unapologetic progressivism. They were presented with a similar choice in the general election, where de Blasio faced a former Rudy Giuliani deputy, Joseph J. Lhota, with strong ties to Wall Street. This time they chose de Blasio by the record margin mentioned above.
As it happens, the common assumption about the New York City electorate — that it’s allergic to Republican positions on race and gender and sexuality but basically conservative on fiscal matters — has come to be applied to the United States as a whole. If Bill de Blasio’s success challenges this view of New Yorkers, perhaps it has larger implications for progressives around the country. Election Day seemed to bear this out. In Seattle, an avowed socialist won a city-council seat. There were also liberal victories in Minneapolis and Boston. At the state level, the past year saw significant progressive legislative victories, especially with respect to wages. Five states raised the minimum wage in 2013, with California leading the way. By 2016, California — the eighth-largest economy in the world — will have a minimum wage of $10.00 an hour. In last year’s State of the Union address, President Obama proposed raising the federal minimum wage from $7.25 to $9.00. By the end of the year, he was on board with a congressional proposal to raise it to $10.10.
All of this has been read as a sudden shift in America’s political winds. But it could simply be that candidates for office, including most Democrats, have for decades been ignoring Americans’ real needs and economic fears. And now a few are at last speaking out. The U.S. economy has been failing the American people since the late 1970s. What has taken so long?
America’s biggest policy problem today is obsession with the size of the federal deficit. That deficit is well under control for the next ten years — likely to fall to 2 percent of GDP by 2015 — but you wouldn’t know that from listening to most legislators. Nor would you know it from listening to the media outlets so often decried for their “liberal bias.” When Democratic and Republican House members recently came together in a rare moment of common sense to discuss addressing the disastrous budget sequester, the Washington Post’s front-page coverage lamented the fact that both parties were “abandoning their debt-reduction goals.”
Democrats have had at least as much to do with creating the deficit straw man as have Republicans. With the help of prominent economists, they began focusing attention on red ink back in the 1980s. At the time, the deficit was Ronald Reagan’s, so Democrats saw an electoral advantage in the rhetoric of fiscal responsibility. Smaller government and less spending became the new watchwords of many in the party, striking fear into the hearts of those of their colleagues who still advocated generous social spending.
This was only part of the reversal of faith in the New Deal, the Great Society, and an earlier progressive era. In the 1970s, Jimmy Carter was a firm believer in price deregulation, at first for airlines and trucking, then for finance. He appointed Paul Volcker to chair the Federal Reserve, and Volcker’s anti-inflationary shock therapy drove the nation into deep recession. Democrats controlled the House of Representatives when Reagan implemented his income- and capital-gains-tax cuts, and Congress supported significant cuts in nondefense discretionary spending throughout the 1980s, even as it let spending on defense rise. Over that decade, discretionary spending excluding the military was cut from 4.3 percent of GDP to 3.4 percent.
George H. W. Bush broke his famous “no new taxes” promise in an effort to reduce the enormous debt he inherited from Reagan. Although this decision helped secure Bill Clinton the White House, Clinton’s great political feat was a significant tax increase on high-income individuals. But because the economy did not bounce back right away, the subsequent Republican midterm triumph in 1994 was interpreted by Clinton’s team as proof that the electorate had gone antigovernment and conservative, vastly changed from the 1960s, which gave us LBJ’s War on Poverty and the Voting Rights Act.
Clinton abided by what his pollsters claimed was the new America, proclaiming the end of “the era of big government” in his 1996 State of the Union address. Domestically, he turned most of his attention to the budget. The powerful Democratic Leadership Council, which Clinton had helped found in the mid-1980s, urged Democratic lawmakers to think twice where tax increases and new social programs were concerned. The DLC supported the practice of “PAYGO,” introduced as part of the 1990 Budget Enforcement Act, which dictated that no new programs should be enacted without the creation of a new revenue source to finance them. This meant there could be few social initiatives without new taxes. The Republicans had won strategically. America was stuck with minimal improvements to the safety net and minimal new funds for transportation, education, and research. Even as Clinton generated surpluses, he resisted big increases in public investment.
To meet his social goals, Clinton generally resorted to tax credits, even though post–Cold War reductions in military spending had created room for direct outlays. He adopted or expanded tax-advantaged programs to fund health insurance, retirement savings, and affordable college education. This approach was part of the greater reliance on free markets advocated by the DLC. It was social policy without outright expenditure. Most important, Clinton expanded the earned-income tax credit, his most effective program to combat poverty in the age of austerity. (It was based on an idea of Milton Friedman’s.) “Market incentives” became the new buzz phrase among middle-of-the-road liberal economists.
Some reforming of social programs was certainly necessary, and there was a place for the markets in enacting these reforms. But the new focus did not represent the return of clear-eyed pragmatism that it promised. Quite the opposite: it was an ideological turning point that moved the nation toward its antigovernment faith. And Democrats were at the center of it. “We know government doesn’t have all the answers,” Clinton said in his 1996 State of the Union address. But no one had ever claimed that government had all the answers. And of course he failed to add that the market doesn’t have all the answers, either.
“Political debates take place within a universe that is shaped by the development of new ideas,’’ Larry Summers, Clinton’s Treasury secretary and a former Keynesian, told an interviewer after he left office.
Of those new ideas, none is more important than the rediscovery of Adam Smith and the idea that a decentralized system relying on price signals collects information and provides much more insurance than any kind of centrally planned or directed type of system.
Who was there to get angry that inequality was rising, wages were stagnating, and Wall Street was remarkably corrupt? The 1990s saw countless accounting scandals, broker-banker conflicts of interest, under-the-table payments for new issues, and tax loopholes galore. Orthodox economists, meanwhile, said it was the best of all worlds. They called it the Great Moderation. Yes, GDP was less volatile, but speculative bubbles and financial crises were rife.
Meanwhile, there has been ample evidence that Adam Smith has his limits. A mainstay of capitalist economic theory, to take one example, is that wages and productivity — that is, economic output per worker — should rise together. But average wages for those whom the government calls “production and non-supervisory workers” — the nonmanagerial class — have risen only 3 percent since 1979, while labor productivity has risen 90 percent. This disparity was obvious as far back as the Clinton years, but no one in the administration acknowledged the problem.
Or consider the hot debate on the minimum wage. Traditional economic theory, invoking the invisible hand, holds that a rise in wages always means fewer jobs. The assumptions supporting the invisible hand are nothing short of monumental, however. They presume, for example, that workers are almost always being paid fairly for what they contribute to the economy. Fortunately, some good economists rely on more than theory and also look at case studies. What most researchers have found is that in the real world, past hikes in the minimum wage usually did not lead to lost jobs. In fact, they produced higher wages and more local demand, supporting growth and job creation. To be fair, there were empirical studies using different methodologies that showed lost jobs, but even in these cases the losses were modest at worst. A survey of prominent mainstream economists found that a large majority believe an increase in the minimum wage from $7.25 to, say, $9.00 would produce more benefits than costs.
Over time, the economic facts were harder to deny. The crash of 2008 caught almost everyone off guard. The people got angry as hell at the bank bailout, however necessary it might have been. The establishment closed ranks, and no financial executives were charged with crimes. Only at the end of 2013 did the economy begin to produce solid growth. The unemployment rate fell to 7 percent, a decline due at last to jobs being created rather than job seekers giving up the search, as happened often in the past few years. (Had those who gave up kept looking, the official unemployment rate would have been more than two percentage points higher — or well more than 9 percent today.) Wages should rise on average in an economic recovery, but on average they had been falling until 2013. These days, median household income is lower after inflation than it was in 1999. Largely for this reason, many non-economists believe we are still in a recession. Two thirds of Americans think the country is on the wrong track.
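The arithmetic behind that parenthetical claim is worth making explicit. The official unemployment rate counts only people actively looking for work, so when discouraged job seekers drop out, the rate falls even if no jobs are created. A minimal sketch, using round illustrative figures (not official Bureau of Labor Statistics data), shows how counting those dropouts pushes a 7 percent rate above 9 percent:

```python
# Illustrative sketch: how discouraged workers affect the unemployment rate.
# All figures below are rounded assumptions for demonstration, not BLS data.

labor_force = 155_000_000            # people working or actively seeking work
unemployed = 0.07 * labor_force      # the official 7 percent rate
discouraged = 4_000_000              # assumed job seekers who gave up the search

# The official rate excludes discouraged workers entirely.
official_rate = unemployed / labor_force

# If those workers had kept looking, they would count as unemployed,
# enlarging both the numerator and the labor force.
adjusted_rate = (unemployed + discouraged) / (labor_force + discouraged)

print(f"official: {official_rate:.1%}, adjusted: {adjusted_rate:.1%}")
```

With these assumed figures the adjusted rate comes out above 9 percent, more than two points higher than the official number, which is the shape of the gap the article describes.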
Why did it take so long for Democrats to rally? Policymakers, economists, and pundits who fear telling the truth because it is unpalatable have been a big part of the problem.
Something new is assuredly in the air, however. If Bill de Blasio can win over nearly three quarters of the electorate in America’s financial capital, surely there is a place for brave politicians throughout the country to change the terms of the debate.