The rise of the internet and a new age of authoritarianism

“The Goliath of totalitarianism will be brought down by the David of the microchip,” Ronald Reagan said in 1989. He was speaking to a thousand British notables in London’s historic Guildhall, several months before the fall of the Berlin Wall. Reagan proclaimed that the world was on the precipice of “a new era in human history,” one that would bring “peace and freedom for all.” Communism was crumbling, just as fascism had before it. Liberal democracies would soon encircle the globe, thanks to the innovations of Silicon Valley. “I believe,” he said, “that more than armies, more than diplomacy, more than the best intentions of democratic nations, the communications revolution will be the greatest force for the advancement of human freedom the world has ever seen.”

At the time, most everyone thought Reagan was right. The twentieth century had been dominated by media that delivered the same material to millions of people at the same time—radio and newspapers, movies and television. These were the kinds of one-to-many, top-down mass media that Orwell’s Big Brother had used to stay in power. Now, however, Americans were catching sight of the internet. They believed that it would do what earlier media could not: it would allow people to speak for themselves, directly to one another, around the world. “True personalization is now upon us,” wrote MIT professor Nicholas Negroponte in his 1995 bestseller Being Digital. Corporations, industries, and even whole nations would soon be transformed as centralized authorities were demolished. Hierarchies would dissolve and peer-to-peer collaborations would take their place. “Like a force of nature,” wrote Negroponte, “the digital age cannot be denied or stopped.”

One of the deepest ironies of our current situation is that the modes of communication that enable today’s authoritarians were first dreamed up to defeat them. The same technologies that were meant to level the political playing field have brought troll farms and Russian bots to corrupt our elections. The same platforms of self-expression that we thought would let us empathize with one another and build a more harmonious society have been co-opted by figures such as Milo Yiannopoulos and, for that matter, Donald Trump, to turn white supremacy into a topic of dinner-table conversation. And the same networked methods of organizing that so many thought would bring down malevolent states have not only failed to do so—think of the Arab Spring—but have instead empowered autocrats to more closely monitor protest and dissent.

If we’re going to resist the rise of despotism, we need to understand how this happened and why we didn’t see it coming. We especially need to grapple with the fact that today’s right wing has taken advantage of a decades-long liberal effort to decentralize our media. That effort began at the start of the Second World War, came down to us through the counterculture of the 1960s, and flourishes today in the high-tech hothouse of Silicon Valley. It is animated by a deep faith that when engineering replaces politics, the alienation of mass society and the threat of totalitarianism will melt away. As Trump fumes on Twitter, and Facebook posts are linked to genocide in Myanmar, we are beginning to see just how misplaced that faith has been. Even as they grant us the power to communicate with others around the globe, our social-media networks have spawned a new form of authoritarianism.


The political vision that brought us to this point emerged in the 1930s, as a response to fascism. In the years before the Second World War, Americans were mystified as to how Germany, one of the most sophisticated nations in Europe, had tumbled down the dark hole of National Socialism. Today we’d likely blame Hitler’s rise on the economic chaos and political infighting of the Weimar era. But at the time, many blamed mass media. When Hitler spoke to row upon row of Nazi soldiers at torch-lit rallies, the radio broadcast his voice into every German home. When he drove through adoring crowds, standing in his Volkswagen convertible, giving the Nazi salute, the newsreel cameras were there. In 1933, the New York Times described the predicament of the average German this way:

With coordinated newspaper headlines overpowering him, with radio voices beseeching him, with news reels and feature pictures arousing him, and with politicians and professors philosophizing for him, the individual German has been unable to salvage his identity and has been engulfed in a brown wave. . . . They are living in a Nazi dream and not in the reality of the world.

Toward the end of the decade, President Roosevelt began searching for ways to urge Americans to take a unified stand against fascism. Given the rise of right-wing fervor in the United States at the time, he had reason to worry. The racism and anti-Semitism that characterized Nazi Germany also characterized much of American life. By 1938, millions of Americans listened weekly as Father Charles Coughlin, a Catholic demagogue, celebrated the rise of fascism and decried the existence of the Jews. Thousands of American fascists banded together in groups with names like the Silver Legion of America and the Crusader White Shirts. The Amerikadeutscher Volksbund, a 25,000-member pro-Nazi organization commonly known as the Bund, ran a summer camp on Long Island called Camp Siegfried, where young men marched in Nazi-style uniforms as their friends and families cheered. On February 20, 1939, the Bund brought more than 22,000 Americans to New York’s Madison Square Garden to welcome fascism to American shores. When they gathered, a huge banner hung over their heads: STOP JEWISH DOMINATION OF CHRISTIAN AMERICANS!

As the United States geared up for war, its leaders faced a quandary: they wanted to use media to unite Americans against their enemies, but many also feared that using mass media to do it would transform Americans into just the kind of authoritarians they were trying to defeat. Roosevelt’s cabinet sought advice from a group of intellectuals calling themselves the Committee for National Morale. The Committee had been founded in the summer of 1940 by a historian of Persian art named Arthur Upham Pope, who brought together a number of America’s leading thinkers, including the anthropologists Margaret Mead and Gregory Bateson, psychologists Gordon Allport and Kurt Lewin, and journalists Edmond Taylor and Ladislas Farago. Over the next two years, they would advise the Roosevelt Administration; produce pamphlets, news articles, and books; and set the cornerstone of our contemporary faith in decentralized media.

The Committee began by defining national morale in terms of what they called the democratic personality. Members of the Committee joined many American intellectuals in subscribing to the views of the anthropologist Franz Boas, who believed that cultures shape the personalities of their members in predictable ways. Germans, they thought, tended toward rigidity and an affection for authority; hence Hitler’s famously bureaucratic Nazi regime was a natural extension of the German character. Americans were more open, individualistic, expressive, collaborative, and tolerant, and so more at home in loose coalitions. Whatever kind of propaganda medium the Committee promoted would need to preserve the individuality of American citizens. Allport summed up the Committee’s vision in a 1942 essay. “In a democracy,” he wrote, “every personality can be a citadel of resistance to tyranny. In the co-ordination of the intelligences and wills of one hundred million ‘whole’ men and women lies the formula for an invincible American morale.”

As the Committee sought to coordinate rather than dominate American minds, its members turned to a kind of media system that we might now call a platform: the museum. These days we’re not used to the idea of buildings as media systems. But the Committee thought about museums in the same way many think about virtual reality today—as immersive visual environments where we can increase our empathy for one another. Mead, who was a student of Boas and worked for the American Museum of Natural History in New York, pointed out that in a museum, people could walk among images and objects distributed across the walls and around the floor, choosing to pay attention to those that seemed most meaningful to them. They could hone their individual tastes, they could reason about their individual places in the world, and they could do it together.

In 1942, the Museum of Modern Art in New York put the Committee’s vision into practice with a widely heralded propaganda exhibition entitled Road to Victory. Most American art shows at the time featured images of more or less identical sizes hung in a row at eye level, but this one mounted images of all sizes overhead, at the viewer’s feet, and everywhere in between. A path wound through the forest of photographs. The pictures were carefully chosen to spark patriotic fervor, but judging from reviews, it was the manner of their display that captivated the show’s audience. As one critic put it, the show did not seek to “mold” its visitors’ beliefs, “for that word smacks of the Fascist concept of dominating men’s minds.” It simply invited Americans to walk down the road to war, individually unique, yet collectively united. Another reviewer wrote: “It is this inescapable sense of identity—the individual spectator identifying himself with the whole—that makes the event so moving.”

The first electronic computer would not be unveiled until 1946, and the internet was still decades away. Yet the Committee’s vision became central to how we think about computers today when several of its members began to collaborate with a mathematician named Norbert Wiener. In the first years of the war, Wiener and his MIT colleagues were trying to design a more accurate antiaircraft defense system. Antiaircraft gunners would only be able to shoot down enemy planes reliably if they could predict where the planes would be when the gunners’ shells reached the sky. At the time, there was no way to make that prediction with any certainty, since both gunner and pilot were capable of random movements. Wiener tried to solve this problem by imagining the gunner, the antiaircraft gun, the enemy pilot, and the enemy plane as elements in a single system whose behavior could be represented mathematically.
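To see the shape of the problem Wiener set himself, consider a deliberately simplified version of the lead-prediction calculation. This is a toy sketch for illustration only; the notation is mine, and Wiener’s actual predictor relied on statistical extrapolation from a plane’s past flight path rather than this naive rule. If a shell takes $\tau$ seconds to climb to the target’s altitude, the gun must aim not at the plane but at where the plane will be $\tau$ seconds from now:

$$\hat{x}(t + \tau) = x(t) + \tau \, \dot{x}(t)$$

Here $x(t)$ is the plane’s tracked position at time $t$, $\dot{x}(t)$ its apparent velocity, and $\hat{x}$ the predicted aim point. The difficulty, and the source of Wiener’s insight, is that $\dot{x}(t)$ cannot simply be read off: it has to be estimated from noisy tracking data and from the partly random maneuvers of a human pilot, which is why gunner, gun, pilot, and plane had to be modeled together as a single statistical system.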

Wiener’s antiaircraft predictor never worked on the battlefield. But his insight that the behavior of both machines and human beings could be represented through computation became a founding principle of computer science. In 1946, it also became a founding principle of a new political vision. That year, Wiener and members of his scientific community traveled to New York to meet with a group of sociologists and psychologists, Mead and Bateson of the Committee for National Morale prominent among them. Together, the social and laboratory scientists began to outline a vision of a liberal world modeled and managed by computers, a vision that they would develop over the next seven years, and that would become one of the most influential intellectual movements of the twentieth century: cybernetics.

In 1950, Wiener published The Human Use of Human Beings, an enormously popular introduction to the new field that argued that modern society operated through a series of information exchanges, just like the antiaircraft predictor. Reporters and social scientists gathered data; intellectuals, business leaders, and politicians processed it; and, ultimately, the systems they controlled took action. When working properly, such a process would naturally tend toward equilibrium—that is, social order. And computers, Wiener argued, could help improve the flow of information by supplying decision-makers with better data faster. “Fascists, Strong Men in Business, and Government . . . prefer an organization in which all orders come from above, and none return,” wrote Wiener. The solution to totalitarianism, he argued, was to recognize the world as a system of leveled, distributed communication that could be modeled and managed by computers. The proper way to achieve the Committee’s vision and democratize society, his argument implied, was to take power away from politicians and put it in the hands of engineers.

Wiener’s writings fired the imaginations of an unlikely group of young Americans, members of the Sixties counterculture who would go on to have an outsized impact on the computer industry. Between 1965 and 1973, as many as 750,000 Americans left their apartments and suburban houses and created new collective communities. A few of these communes were religious, but most were secular gatherings of white, middle- and upper-middle-class young people seeking to leave mainstream America behind. In northern California, refugees from Haight-Ashbury migrated north, to the woods of Mendocino, and east, to the high plains of Colorado and the mountains of New Mexico. Some even set up shop in the hills around Stanford University, overlooking what we now call Silicon Valley.

Elsewhere I’ve called this generation of pilgrims the “New Communalists” to distinguish them from members of the New Left, with whom they often disagreed. Unlike the young dissidents who formed parties and wrote manifestos, the New Communalists hoped to do away with politics entirely. They wanted to organize their communities around a shared mind-set, a unified consciousness. Many agreed with Charles Reich when he wrote in his 1970 bestseller, The Greening of America, that industrial society offered “a robot life, in which man is deprived of his own being, and he becomes instead a mere role, occupation, or function.” According to Reich, the solution was to cultivate a new consciousness of one’s own desires and needs, of the connections between one’s body, one’s mind, and the natural world. Such a consciousness, he explained, could become the foundation of a new kind of society, one that would be nonhierarchical and collaborative.

Watching this migration take shape was Stewart Brand, a former multimedia artist and sometime member of Ken Kesey’s psychedelic wrecking crew, the Merry Pranksters. In 1968, Brand and his wife Lois drove their aging pickup truck to a string of communes to see what the new settlers needed in the way of tools. That fall, the Brands set up shop in Menlo Park, California, not far from where Facebook’s headquarters stand today, and began to publish a document that quickly became required reading across the counterculture: the Whole Earth Catalog. Despite its name, the Catalog did not actually sell anything. Instead, it collected recommendations for tools that might be useful to people headed back to the land. One of those tools was Norbert Wiener’s first book, Cybernetics. Another was an early and massive Hewlett-Packard calculator.

The New Communalists eschewed what Reich called the “machine world” of tanks and bombs and the industrial bureaucracies that produced them. The rule-bound hierarchies of the corporation and the state, they thought, alienated their members from their own feelings and turned them into the kind of buttoned-down apparatchiks who could launch a nuclear war. Even so, the New Communalists embraced small technologies that they hoped would help them live as independent citizens within the kind of universe that Wiener and the Committee had described, a universe in which all things were interlinked by information. The Catalog gave readers access to plans for Buckminster Fuller’s geodesic domes and guides to everything from cheap travel to boatbuilding. At a time when it could be difficult to find a commune if you didn’t know someone who lived on one, the Catalog also became a map of the commune world and its concerns. As the first nodes of the internet were being wired together, the Catalog became a paperbound search engine.

The future leaders of Silicon Valley took notice. Steve Jobs, who had spent some time on a commune called All One Farm, would later call the Catalog “one of the bibles of my generation. . . . It was sort of like Google in paperback form, thirty-five years before Google came along.” Alan Kay, whose designs for a graphical user interface would shape several generations of Apple computers, explained that he and his colleagues saw the Catalog as an information system in its own right. In that sense, he said, he “thought of the Whole Earth Catalog as a print version of what the internet was going to be.”

By the mid-1980s, computers were small enough to sit on desks, and individual users were able to type messages to one another in real time. Most of the communes had collapsed, but the computer industry in northern California was growing rapidly, and it welcomed former communards. Brand worked with Larry Brilliant, who would later help launch Google’s philanthropy division, Google.org, to design an online discussion system known as the Whole Earth ’Lectronic Link, or WELL. On the WELL, users dialed in to a server where they saw messages from other users in threaded conversations. Howard Rheingold, a journalist and early member, believed that the WELL was a melding of the minds, a kind of virtual community. “Personal computers and the PC industry,” he wrote later, “were created by young iconoclasts who had seen the LSD revolution fizzle, the political revolution fail. Computers for the people was the latest battle in the same campaign.”

By the end of the decade, when Reagan hailed the “David of the microchip,” many in Silicon Valley believed they had the tools to create the kind of person-centered democracy that the Committee had envisioned. They would achieve it through open conversation spaces like the WELL, engineered public spheres in which individuals gave voice to their experiences, gathered feedback from their peers, and changed their behavior accordingly. They shared Wiener’s faith in the power of information systems to liberate those who used them, and the Committee’s confidence that, given the chance to express themselves, individuals could create their own social order, without the need for top-down government control. If the mass-media era had brought us Hitler and Stalin, they believed, the internet would bring us back our individuality. Finally, we could do away with hierarchy, bureaucracy, and totalitarianism. Finally, we could just be ourselves, together.

Today, that sense of utopian mission persists throughout Silicon Valley. A month after Trump took office, Mark Zuckerberg laid out his social vision in a Facebook post entitled “Building Global Community.” Though only a few thousand words long, the document is every bit as ambitious as Wiener’s The Human Use of Human Beings. Like Wiener, Zuckerberg envisions a world in which individuals, communities, and nations create an ideal social order through the constant exchange of information—that is, through staying “connected.” “Our greatest opportunities are now global—like spreading prosperity and freedom, promoting peace and understanding, lifting people out of poverty, and accelerating science,” he wrote, sounding much like a representative of the Cold War–era State Department. “In times like these,” he continued, “the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us.”

For Zuckerberg, as for much of the left today, the key to a more egalitarian society lies in the freeing of individual voices, the expression of different lived experiences, and the forming of social groups around shared identities. But Facebook has tried to enable this kind of society by creating privately owned, for-profit digital technologies. As Zuckerberg put it, echoing the goals of the Whole Earth Catalog fifty years before, “Our commitment is to continue improving our tools to give you the power to share your experience.” Engineers like Zuckerberg or, for that matter, Wiener, have little interest in party politics: if you want to change the world, you don’t lobby or vote; you build new technologies.

This view has proved enormously profitable across Silicon Valley. By justifying the belief that for-profit systems are the best way to improve public life, it has helped turn the expression of individual experience into raw material that can be mined, processed, and sold. The big social-media companies, which often began with a dream of making WELL-like virtual communities at scale, have now become radically commercialized and devoted to surveillance at every level. On the WELL, users listened to each other, trying to get a feel for what kinds of people they were and how they might work together. Now user data is optimized and retailed automatically, to advertisers and other media firms, in real time. Computers track conversations and extract patterns at light speed, rendering them profitable. In 2017, Facebook reported annual revenue of more than $40 billion.

Social media’s ability to simultaneously solicit and surveil communication has not only turned the dream of individualized, expressive democracy into a fountain of wealth. It has turned it into the foundation of a new kind of authoritarianism. Fascists used to be distinguished by their penchant for obedience, submission, and self-erasure, with the power of public emotional expression reserved for the dictator. That is why both Wiener and the Committee stressed the qualities of independence and self-awareness in the democratic personality. And it was against the background of fascism that, during and after the 1960s, Vietnam protestors, civil-rights activists, feminists, queer-rights activists, and other members of the myriad communities who drove the rise of identity politics asserted their individual, lived experience as the basis of their right to political power. If the essence of totalitarianism was collective self-effacement, the foundation of democracy would have to be the assertion of collective individuality.

Today, radio and television talk shows, podcasts, blogs, and, of course, social media are part of a new media ecosystem that has rendered the voicing of one’s experiences so easy and powerful as to turn it into an appealing tool for the right as well as the left. Figures such as Richard Spencer have adopted the playful, confessional style of online influencers everywhere. Since Spencer coined the euphemistic term in 2008, the “alt-right” has come to shelter white nationalists, anti-Semites, radical misogynists, and neo-Nazis. What holds the movement together in the public eye is its savvy use of social media. Over the past two years, scholars at the Data & Society Research Institute, an independent think tank in New York, have been tracking the rise of the alt-right online. In a series of reports, they have revealed a world in which the kinds of men who chanted “Jews will not replace us!” in Charlottesville, Virginia, present themselves in the bright, first-person style of online makeup instructors. They aim to be read as whole people—witty, warm, and authentically themselves.

Rebecca Lewis, a Data & Society researcher who is now a PhD student at Stanford, has studied sixty-five such right-wing influencers on YouTube. Most are masters of microcelebrity. They brand themselves with care, spark attention-getting controversy wherever possible, link to one another’s websites, appear on one another’s YouTube shows, and optimize their video feeds for search engines. Despite their intellectual differences, Lewis points out, they have been able to create the impression that they are a unified political force. Their chummy, millennial-friendly style, she argues, goes a long way toward suggesting that really, you know, anti-Semitism and violent, racist riots are the kind of thing that thinking young people everywhere ought to embrace.

Alt-right figures have consciously modeled their online behavior after the political logic of the 1960s counterculture, and particularly its New Communalist wing. In a 2016 interview with The Atlantic, Spencer could have been channeling an entire generation of commune-builders when he said, “We are really trying to change the world, and we are going to do that by changing consciousness, and by changing how people see the world, and how they see themselves.” The Daily Stormer, a neo-Nazi website, put the project less benignly in a leaked style guide: “One should study the ways that Jews conquered our culture in the 1960s. . . . They created a subculture by infesting certain elements of the existing culture. That is what we aim to do.”

The identity-based movements of the left have been extraordinarily effective at changing American culture, and the alt-right clearly hopes to copy their success. By claiming the mantle of rebellion, the alt-right can take to the streets in protest as if anticolonialism in the classroom were a new Vietnam War. They can argue that their ability to spew hate is in fact a civil right, and that their movement is simply a new version of the Free Speech Movement of 1964. On YouTube, they can tell stories of their own conversion to conservatism in an idiom pioneered by gay activists: the coming-out story. Lewis notes that the conservative activist Candace Owens rose to YouTube fame after she posted a humorous video on her channel, Red Pill Black, that revealed her political beliefs to her parents. Owens titled it “Mom, Dad? . . . I’m a Conservative.” When friends and families find their new politics reprehensible, the converts need not engage. Their storytelling style alone implies that racism and nationalism are in fact just as natural and true as a person’s sexuality.

Pundits on the left are fond of reminding us of how Trump storms and fulminates, the White House itself unable to contain his petulance and rage. Those same pundits then marvel that around 40 percent of the American people still think he is doing a good job. What they fail to understand is that Trump has mastered the politics of authenticity for a new media age. What mainstream analysts see as psychological weakness, Trump’s fans see as the man just being himself. What’s more, his anger, his rants, and his furious narcissism act out the feelings of people who believe they have been dispossessed by immigrants, women, and people of color. Trump is not only true to his own emotions. He is the personification of his supporters’ grievances. He is to his political base what Hitler was to many Germans, or Mussolini to Italians—the living embodiment of the nation.

Here, the identity-centered liberalism that has dominated so much of public life since the Second World War has come full circle. Its victories have been many, from civil rights to legalized abortion and gay marriage, and they have dramatically changed American life for the better. But in the form of people like Trump and Spencer, the performance of individualism—the revelation of the whole person in the context of public debate that was meant for so long to be a bulwark against totalitarianism—has also allowed today’s authoritarians to claim a new legitimacy. Fifty years ago, the New Left marched on the Pentagon, hoping to undermine the military-industrial complex behind the Vietnam War. Today, Trump attacks the FBI and the Justice Department, hoping to undermine a fantastical Minotaur called “the deep state.” Fifty years ago, the counterculture hoped to bring about a world in which individuals could be more authentically themselves, and in which the hierarchies of organizations and states would disappear. Today, those hierarchical institutions are all that stand between us and a cult of personality.

If the communes of the 1960s teach us anything, they teach us that a community that replaces laws and institutions with a cacophony of individual voices courts bigotry and collapse. Without explicit, democratically adopted rules for distributing resources, the communes allowed unspoken cultural norms to govern their lives. Women were frequently relegated to the most traditional of gender roles; informal racial segregation was common; and charismatic leaders—almost always men—took charge. Even the most well-intentioned communes began to replicate the racial and sexual dynamics that dominated mainstream America. Lois Brand recalled that on the communes they visited, men would do “important stuff” like framing up domes, while she and the other women would put small amounts of bleach in the water to keep residents from getting sick.

For all their sophistication, the algorithms that drive Facebook cannot prevent the recrudescence of the racism and sexism that plagued the communes. On the contrary, social-media platforms have helped bring them to life at a global scale. And now those systems are deeply entrenched. Social-media technologies have spawned enormous corporations that make money by mapping and mining the social world. Like the extraction industries of previous centuries, they are highly motivated to expand their territories and bend local elites to their will. Without substantial pressure, they have little incentive to serve a public beyond their shareholders. Companies such as Facebook and Twitter are coming to dominate our public sphere to the same degree that Standard Oil once dominated the petroleum industry. They too should be subject to antitrust laws. We have every right to apply the same standards to social-media companies that we have applied to other extraction industries. We cannot allow them to pollute the lands they mine, or to injure their workers, nearby residents, or those who use their products.

As Columbia law professor Tim Wu has argued, social-media companies are enabling a new form of censorship by allowing human and robotic users to flood the inboxes of their enemies in an effort to keep them quiet, and there are little-used provisions of the First Amendment that could radically slow these processes. We also have alternatives to traditional private or stockholder ownership of our social media. We can already see some of the possibilities in sharing practices developed within the computer industry, such as open-source code and “copyleft” rights management. An international community of scholars and technologists has looked for some time at creating cooperatively owned online platforms. As Nathan Schneider, a professor at the University of Colorado and one of the movement’s leaders, has pointed out, member-owned cooperatives generate 11 percent of the electricity sold in America. If social media are equally important to our lives, he asks, why shouldn’t we take a hand in owning and managing them?

That question is a good one, but it doesn’t quite capture the historical specifics of our situation. The new authoritarianism represented by Spencer and Trump is not only a product of who owns today’s media. It’s also a product of the political vision that helped drive the creation of social media in the first place—a vision that distrusts public ownership and the political process while celebrating engineering as an alternative form of governance. Since the Second World War, critics have challenged the legitimacy of our civic institutions simply on the grounds that they were bureaucratic and slow to change. Yet organizations such as hospitals demonstrate the value of these features. They remind us that a democracy must do more than allow its citizens to speak. It must help them live. Above all, it must work to distribute our wealth more equitably and to ensure that every member of society has both independence and security. This is work that requires intense negotiation among groups with conflicting material interests, and, often, deep-seated cultural differences. It requires the existence of institutions that can preserve and enforce the results of those negotiations over time. And it requires that those institutions be obliged to serve the public before tending to their own profits.

Today’s social media will never be able to do the difficult, embodied work of democracy. Computer-supported interconnection is simply no substitute for face-to-face negotiation, long-term collaboration, and the hard work of living together. The Black Lives Matter and #MeToo movements have taught us that social media can be a powerful force for liberating us from the fiction that all is well just as it is. But the attention these activists have brought to their causes will mean little if the changes they call for are not enshrined in explicit, enforceable laws. Even though the American state can be inefficient, unfair, corrupt, and discriminatory, the logic of representation that underlies it remains the most effective engine we have for ensuring the equitable distribution of our collective wealth.

Over time, as new media have saturated our public lives, and as the children of the 1960s have grown into the elites of today, we have learned that if we want a place on the political stage, we need to make our interior lives outwardly visible. We need to say who we are. We need to confess. When Richard Spencer calls himself a member of a victimized minority, or when Donald Trump bares his anger on Twitter, they are using the same tactics once deployed by the protesters of the 1960s or, for that matter, by participants in the #MeToo movement today. To make this observation is not to say that their causes are in any way equivalent—far from it. But whether they are lying like Trump or revealing long-buried truths like the members of #MeToo, those who would claim power in the public sphere today must speak in a deeply personal idiom. They must display the authentic individuality that members of the Committee for National Morale once thought could be the only bulwark against totalitarianism, abroad and at home.

Speaking our truths has always been necessary, but it will never be sufficient to sustain our democracy. It’s time to let go of the fantasy that engineers can do our politics for us, and that all we need to do to change the world is to voice our desires in the public forums they build. For much of the twentieth century, Americans on both the left and right believed that the organs of the state were the enemy and that bureaucracy was totalitarian by definition. Our challenge now is to reinvigorate the institutions they rejected and do the long, hard work of turning the truths of our experience into legislation.

Fred Turner is the Harry and Norman Chandler Professor of Communication at Stanford University.


