Ralph Waldo Emerson’s lecture “The American Scholar,” which he delivered in 1837, implicitly raises radical questions about the nature of education, culture, and consciousness, and about their interactions. He urges his hearers to make the New World as new as it ought to be, to outlive the constraints that colonial experience imposed on them and to create the culture that would arise from the full and honest use of their own intellects, minds, and senses. Any speaker might say the same to any audience. Every generation is in effect colonized by its assumptions, and also by the things it reveres. The future, in American experience, has always implied inevitable departure from the familiar, together with the possibility of shaping inevitable change. The historical circumstances of the country at the time Emerson spoke made vivid what is always true: that there is a frontier, temporal rather than geographical, which can and surely will be the new theater of old crimes and errors, but can and will also be an enlargement of experience, a region of indeterminacy, of possibility.
In his introduction to Democracy in America, Alexis de Tocqueville says a striking thing about the world that was then unfolding:
From the moment when the exercise of intelligence had become a source of strength and wealth, each step in the development of science, each new area of knowledge, each fresh idea had to be viewed as a seed of power placed within people’s grasp. Poetry, eloquence, memory, the beauty of wit, the fires of imagination, the depth of thought, all these gifts which heaven shares out by chance turned to the advantage of democracy and, even when they belonged to the enemies of democracy, they still promoted its cause by highlighting the natural grandeur of man. Its victories spread, therefore, alongside those of civilization and education.
Tocqueville, like Emerson, stood at a cusp of history where literacy and democracy were assuming an unprecedented importance in the civilization of the West. Though not without ambivalence about democracy, Tocqueville did see it as based on “the natural grandeur of man,” and brought to light by education. Poetry, eloquence, memory, wit, the fires of imagination, the depth of thought: these are mentioned as rarely now as the object or effect of education as “the natural grandeur of man” is mentioned as a basis of our culture or politics.
Emerson was speaking at a moment when colleges were being founded all across America — my own university, Iowa, in 1847. At that time the great Frederick Law Olmsted was putting his aesthetic blessing on our public spaces, and notably on college campuses. The Oxford English Dictionary defines “campus” as an Americanism. The conventions established in the early nineteenth century have persisted in the meadows and gardens and ponds that celebrate, if only out of habit, these cities of the young, these local capitals of learning and promise. Olmsted, like Emerson, would have seen something like what Tocqueville describes: the emergence of brilliant individuality in unexpected places. This individuality was strongly potential in American life, though as yet suppressed, according to Emerson, by a preoccupation with the practical, with trade and enterprise, and suppressed as well by a colonial deference to the culture of Europe. Like Tocqueville, Emerson is proposing an anthropology, proposing that there is a splendor inherent in human beings that is thwarted and hidden by a deprivation of the means to express it, even to realize it in oneself. The celebration of learning that was made visible in its spread into the territories and the new states must have taken some part of its character from the revelation of the human gifts that education brought with it. It is interesting to see what persists over time, and interesting to see what is lost.
For those to whom Emerson is speaking, who have made a good account of themselves as students at Harvard, deprivation is the effect of an unconscious surrender, a failure to aspire, to find in oneself the grandeur that could make the world new. We know these people. In fact we are these people, proudly sufficient to expectations, our own and others’, and not much inclined to wonder whether these expectations are not rather low. We have, of course, accustomed ourselves to a new anthropology, which is far too sere to accommodate anything like grandeur, and which barely acknowledges wit, in the nineteenth-century or the modern sense. Eloquence might be obfuscation, since the main project of the self is now taken by many specialists in the field to be the concealment of selfish motives. How do we define imagination these days, and do we still associate it with fires? Unless it is escape or delusion, it seems to have little relevance, for good or ill, to the needs of the organism. So, like character, like the self, imagination has no doubt by now been defined out of existence. We leave it to a cadre of specialists to describe human nature — a phrase that by their account no doubt names yet another nonexistent thing. In any case, these specialists would show no fondness for human nature even if they did concede its existence, nor would they allow it any of the traits that it long found ingratiating in itself. This is so true that the elimination of the pleasing, the poignant, the tragic from our self-conception — I will not mention brilliance or grandeur — would seem to be the object of the exercise. Plume-plucked humankind. Tocqueville and Emerson might be surprised to find us in such a state, after generations of great freedom, by the standards of history, and after the vast elaboration of resources for learning in every field.
Indeed, it is this vast elaboration, epitomized in the American university, that proves we once had a loftier view of ourselves, and it is a demonstration of the change in our self-conception that our universities no longer make sense to legislatures and to “people of influence” — a phrase that, in our moment, really does mean moneyed interests. Traditional centers of influence — churches, unions, relevant professionals — have lost their place in public life, or, speaking here of those churches that do maintain a public presence, they have merged their influence with the moneyed interests. From the perspective of many today, the great public universities (and many of them are very great) are like beached vessels of unknown origin and intention, decked out preposterously with relics and treasures that are ripe for looting, insofar as they would find a market, or condemned to neglect and decay, insofar as their cash value is not obvious to the most stringent calculation.
There has been a fundamental shift in American consciousness. The Citizen has become the Taxpayer. In consequence of this shift, public assets have become public burdens. These personae, Citizen and Taxpayer, are both the creations of political rhetoric. (It now requires an unusual degree of historical awareness to know that both politics and rhetoric were once honorable things.) An important aspect of human circumstance is that we can create effective reality merely by consenting to the phantasms of the moment or of the decade. While the Citizen can entertain aspirations for the society as a whole and take pride in its achievements, the Taxpayer, as presently imagined, simply does not want to pay taxes. The societal consequences of this aversion — failing infrastructure, for example — are to be preferred to any inroad on his or her monetary fiefdom, however large or small. Like limits on so-called Second Amendment rights, this is a touchy point. Both sensitivities, which are treated as though they were protections against centralization and collectivism, are having profound consequences for our society as a whole, and this without meaningful public debate, without referendum.
Citizenship, which once implied obligation, is now deflated. It is treated as a limited good that ought to be limited further. Of course, the degree to which the Citizen and the Taxpayer ever existed, exist now, or can be set apart as distinct types is a question complicated by the fact that they are imposed on public consciousness by interest groups, by politicians playing to constituencies, and by journalism that repeats and reinforces unreflectingly whatever gimmicky notion is in the air. It can be said, however, that whenever the Taxpayer is invoked as the protagonist in the public drama, a stalwart defender of his own, and a past and potential martyr to a culture of dependency and governmental overreach, we need not look for generosity, imagination, wit, poetry, or eloquence. We certainly need not look for the humanism Tocqueville saw as the moving force behind democracy.
I will put aside a fact that should be too obvious to need stating: that America has done well economically, despite passing through difficult periods from time to time, as countries will. It would be very hard indeed to make the case that the Land Grant College Act, the 1862 federal law that gave us many of our eminent public universities, has done us any economic harm, or that the centrality of the liberal arts in our education in general has impeded the country’s development of wealth. True, a meteor strike or some equivalent could put an end to everything tomorrow. But if we were obliged to rebuild ourselves we could not find a better model for the creation of wealth than our own history. I do not mean to suggest that wealth is our defining achievement, or that it is the first thing we should protect. But since money is the measure of all things these days it is worth pointing out that there are no grounds for regarding our educational culture as in need of rationalization — it must be clear that I take exception to this use of the word — to align it with current economic doctrine.
All this sidesteps the old Kantian distinction — whether people are to be dealt with as means or as ends. The argument against our way of educating is that it does not produce workers who are equipped to compete in the globalized economy of the future. This has to be as blunt a statement as could be made about the urgency, currently felt in some quarters and credulously received and echoed everywhere, that we should put our young to use to promote competitive adequacy at a national level, to whose profit or benefit we are never told. There is no suggestion that the gifts young Americans might bring to the world as individuals stimulated by broad access to knowledge might have a place or value in this future, only that we should provide in place of education what would better be called training.
If all educational institutions feel this pressure to some degree, public institutions feel it most continuously and profoundly. A university like mine, founded almost 170 years ago, before Iowa even had much in the way of cornfields, gives unambiguous evidence of the kinds of hopes that lay behind its establishment and sustained it through many generations. From an early point the University of Iowa emphasized the arts. It was founded while Emerson was active, and at about the time Tocqueville was published, and can fairly be assumed to have shared their worldview. The same is true of many public universities in America. Accepting creative work toward a graduate degree — the M.F.A. as we know it now — was an Iowa innovation. My own program, the Writers’ Workshop, is the oldest thing of its kind on the planet. People do ask me from time to time why Iowa is in Iowa. For the same reason Bloomington is in Indiana, no doubt. If we were better pragmatists, we would look at the fact that people given a relatively blank slate, the prairie, and a pool of public resources (however modest) are at least as desirous of the wonderful as of the profitable or necessary.
Atavism is a potent force in human history. The pull of the retrograde, an almost physical recoil, is much more potent than mere backsliding, and much more consequential than partial progress or flawed reform. The collective mind can find itself reinhabited by old ideas unwillingly, almost unconsciously. A word around which retrograde thinking often constellates is “elitist.” Liberal education was for a very long time reserved to an elite — whence the word “liberal,” befitting free men — who were a small minority in Western societies. Gradually, though swiftly by the standards of the world at large, Americans began democratizing privilege. As Tocqueville remarks, heaven shares out by chance those high gifts of intellect and culture that had previously been associated arbitrarily with status and advantage, which are now manifest as a vastly more generous endowment. We need only allow the spread of learning to see the potential for brilliance in humankind.
But the memory persists that the arts were once social attainments and that the humanities suited one to positions of authority. What use could this education be to ordinary people? What claim should something of such doubtful utility have on the public purse? Or, to look at the matter from another side, why should an English class at the University of Wisconsin be as excellent as an English class at Stanford University, for a mere fraction of the cost? The talk about “top-tier universities,” about supposed rankings, that we hear so often now creates an economics of scarcity in the midst of an astonishing abundance. And it helps to justify assaults on great public resources, of the kind we have seen recently in Wisconsin, under Scott Walker, and elsewhere. Public universities are stigmatized as elitist because they continue in the work of democratizing privilege, of opening the best thought and the highest art to anyone who wants access to them. They are attacked as elitist because their tuition goes up as the support they receive from the government goes down. The Citizen had a country, a community, children and grandchildren, even — a word we no longer hear — posterity. The Taxpayer has a 401(k). It is no mystery that the former could be glad to endow monumental libraries, excellent laboratories, concert halls, arboretums, and baseball fields, while the latter simply can’t see the profit in it for himself.
There is pressure to transform the public university so that less cost goes into it and more benefit comes out — as such things are reckoned in terms of contemporary economic thinking. That this economics could be so overbearingly sure of itself ought to be remarkable given recent history, but its voice is magnified in the void left by the default of other traditional centers of authority. In any case, whether and how we educate people is still a direct reflection of the degree of freedom we expect them to have, or want them to have. Since printing became established in the West and the great usefulness of literacy began to be recognized, it has been as characteristic of cultures to withhold learning as it has to promote it. Many cultures do both, selectively, and their discrimination has had profound effects that have persisted into the present, as they will certainly persist into the future. In most Western cultures the emergence of literacy in women lagged far behind literacy in men, and in many parts of the world women are still forbidden or discouraged from reading and writing, even though limiting them in this way radically slows economic development, among other things.
Western “progress” is an uncertain road to happiness, I know. But insofar as Western civilization has made a value of freeing the mind by giving it ability and resources, it has been a wondrous phenomenon. Wherever its strong, skilled attention has fallen on the world, it has made some very interesting errors, without doubt, and has also revealed true splendors. In either case it has given us good reason to ponder the mind itself, the character of the human mind that is so richly inscribed on the cultural experience of all of us, in the ways and degrees that we, as individuals, are prepared to read its inscriptions.
“Insofar as.” American civilization assumes literacy, saturates our lives with print. It also fails to make an important minority of its people competent in this skill, which most of us could not imagine living without. Old exclusions come down the generations — we all know that parents are the first teachers. Old injustices come down the generations — why bother to educate people who have no use for education, the hewers of wood and drawers of water? The argument was made that peasants, women, slaves, and industrial workers would be happier knowing nothing about a world that was closed to them in any case. For a very long time that world was closed to them, and they could be assumed to be ignorant of it. To the degree that all this has changed, social equality and mobility have followed. Many traditional barriers are lower, though they have not yet fallen.
What exactly is the impetus behind the progressive change that was simultaneous with the emergence of modern society? Our era could well be said to have had its origins in a dark age. The emergence of the factory system and mass production brought a degree of exploitation for which even feudalism had no equivalent. The severest possible cheapening of labor in early industry was supported by the same theories that drove colonialism and chattel slavery. The system yielded spectacular wealth, of course, islands of wealth that depended on extreme poverty and the profounder impoverishments of slavery. Comparisons are made between slavery and so-called free labor that seem always to imply the second was more efficient than the first, and therefore destined, on economic grounds, to become the dominant system, which would mean a general amelioration of conditions. It is a very imprecise use of language, however, to describe as free a labor force that was largely composed of children, who, on the testimony of Benjamin Disraeli among many others, could not and did not expect to live far beyond childhood. It is an imprecise use of language to call free the great class of laborers who, outside the parishes where they were born, had no rights even to shelter. The fact of social progress has been treated as a demonstration that laissez-faire works for us all, that the markets are not only wise but also benign, indeed humane. This argument is based on a history that was effectively invented to serve it, and on a quasi-theology of economic determinism, a monotheism that cannot entertain the possibility that social circumstances might have any other origin than economics. Clever as we think we are, we are enacting again the strange — and epochal — tendency of Western civilization to impoverish.
We have come to a place where these assumptions are being tested against reality, if not in our universities and think tanks. Reality is that turbulent region our thoughts visit seldom and briefly, like Baedeker tourists eager to glimpse the sights that will confirm their expectations and put them on shared conversational ground with decades of fellow tourists. We leave trash on Mount Everest, we drop trash in the sea, and reality goes on with its life, reacting to our depredations as it must while ages pass, continents clash, and infernos boil over. It is true that our carelessness affects the world adversely, and it is also true that the world can fetch autochthonous surprises up out of its fiery belly. The metaphor is meant to suggest that we are poor observers, rarely seeing more than we intend to see. Our expectations are received, and therefore static, which makes it certain that they will be like nothing in reality. Still, we bring our expectations with us, and we take them home with us again, reinforced.
Historical time also has a fiery belly and a capacity for devastating eruptions. It has equivalents for drought and desert, for glaciation. Its atmosphere can dim and sicken. Any reader of history knows this. If the changes that occur in the past are substantially the result of human activity, they nevertheless rarely reflect human intention, at least when they are viewed in retrospect. They seem always to elude human notice until they are irreversible, overwhelming.
Western civilization once had a significant place among world cultures in articulating a sense of vastness and richness through its painting, poetry, music, architecture, and philosophy. Then, rather suddenly, this great, ancient project was discountenanced altogether. The sacred was declared a meaningless category, a name for something compounded of fear, hope, and ignorance, the forgivable error of an earlier age that was configured around certain ancient tales and ceremonies. That it yielded works of extraordinary beauty and profundity was acknowledged fulsomely in modernist nostalgia, whose exponents saw themselves as the victims of the transformations they had announced, and, to a considerable degree, created. Grand-scale change was imminent and inevitable, of course. Empires were falling, technology was rising. Whether the new age needed to bring with it this mawkish gloom is another question.
I propose that the thought we call modern was by no means robust and coherent enough intellectually to discredit metaphysics or theology, though it did discredit them. To account for this I would suggest that modernism’s appeal to the upper strata of culture was the way that it closed questions rather than opening them. Modernism’s nostalgias made it resistant to reform, and excused it from giving a rational defense for that resistance. It created the narrative of a breach with the European past by ignoring European history. After the horrors of the First World War, Europe seems to have comforted itself by arming for the Second World War. That is to say, humankind has always given itself occasions for grief and despair, and has seldom made better use of them than to store up grudges and provocations to prepare the next occasion.
Human cultural achievements may be thought of as somewhat separate from these periodic rampages. There are no grounds for thinking that high civilizations are less violent than any others — no grounds, that is, except in the histories and anthropologies that are written by them. So long as warfare and other enormities are treated as paradoxical, anomalous, and aberrant, we will lack a sufficiently complex conception of humankind. History can tell us that neither side of our nature precludes the other. The badness of the worst we do does not diminish the goodness of the best we do. That our best is so often artistic rather than utilitarian, in the usual senses of both words, is a truth with which we should learn to be at peace.
Since Plato at least, the arts have been under attack on the grounds that they have no useful role in society. They are under attack at present. We have convinced ourselves that the role of the middle ranks of our population is to be useful to the economy — more precisely, to the future economy, of which we know nothing for certain but which we can imagine to be as unlike the present situation as the present is unlike the order that prevailed a few decades ago. If today is any guide, we can anticipate further profound disruption. Whatever coherence the economy has created in the culture to this point cannot be assumed. The economic forces to which reverence is paid increasingly amount to little more than faceless people with no certain qualifications playing with money, accelerating the accumulation of wealth in very few hands, and that reverence enforces the belief that our hopes must be surrendered to these forces. The coherence that society might take from politics — that is, from the consciousness that it is a polity, a human community with a history and an aspiration toward democracy, the latter of which requires a capacity for meaningful decisions about its life and direction — exists apart from these forces and is at odds with them. So far as those forces are determining, and so long as they succeed in defining utility, value, and legitimacy for the rest of us, we will have surrendered even the thought of creating a society that can sustain any engagement or purpose beyond that endless openness and submissiveness to other people’s calculations and objectives that we call competition.
At the moment, two propositions are taken to be true: that our society must be disciplined and trained to compete in a global market, and that these competitive skills have no definable character. Who might not be displaced by a computer or a robot? Who might not be displaced by a foreign worker or an adjunct? Economics from its brutal beginnings has told us that cheap labor will give its employers a competitive advantage, and that costly labor will drive industries into extinction or into foreign markets. (Oddly, the monstrously costly executives who work in these industries seem never to have this effect.) Nineteenth-century economics told us that labor both creates value and is the greatest expense in the production of value. When Marx wrote about these things he was using a vocabulary that is still descriptive, and therefore useful, now. Nationalism was involved in all this, historically. The colonial system, which was entirely bound up with trade and industry, enjoyed the power and the grandeur of the old European empires. The global reach of the early industrial system made mass poverty a national asset, as it remains. According to the theory that rationalized the system, a worker’s wage should not exceed the level necessary for his subsistence, that is, the minimum level necessary to leave him, more probably her, physically able to work. This brazen law, as they called it, is still in force in many of those societies with which we are told we are competing.
The most obvious evidence that the United States proceeded for a long time on other assumptions is our educational system, which is now seen by many people as an obstacle to recruiting ourselves to the great project of competing in the world economy. If it is no longer clear what these singular institutions should be doing, it is pretty clear what they should not be doing, which is disseminating knowledge and culture, and opening minds. The dominant view today is that the legitimate function of a university is not to prepare people for citizenship in a democracy but to prepare them to be members of a skilled but docile working class.
It has been characteristic of American education to offer students a variety of fields of study and a great freedom to choose among them. This arrangement has served as a mighty paradigm for the kind of self-discovery that Americans have historically valued. Now this idea has gone into eclipse. The freedom of the individual has been reduced to a right to belligerent ignorance coupled with devotion to a particular reading of the Second Amendment. Other than this, Americans are offered a future in which their particular interests, gifts, and values will have minimal likelihood of expression, since those interests, gifts, and values are not likely to suit the uses of whatever employment is on offer. Americans today must give up the thought of shaping their own lives, of having even the moral or political right to try to stabilize them against the rigors and uncertainties of the markets, those great gods. All this speculation assumes, of course, something that has never been true in this country, that society and the economy will be dominated by great industries that make more or less uniform demands of their workers, as those primordial cotton mills did. We are encouraged to accept the inevitability of a dystopia, our brightest hope being that we or our children will do better than most people.
An early step in making all this inevitable is the attack on higher education, a resurgence of that old impulse to force society into the form that is considered natural and necessary, an impulse that grows stronger whenever society as it exists seems recalcitrant. In general, of course, our system of higher education could hardly be less suited to serve the unspecific but urgent purposes of the future economy that has been imagined for us, which really bears more resemblance to our forgotten past, the dark age that taught the few how to wring benefits from cheapening the labor of the many, that is to say, from depressing their living standards. This object has been mightily assisted by high levels of unemployment and by access to labor markets in countries where severe poverty is endemic.
One rationale for all this is that Western prosperity sprang from the dark age of early industrialism. Post hoc, ergo propter hoc. This is a tendentious reading of a very complex history. Henry Ford’s realization that workers should be paid enough to be able to afford the things they made, an inversion of the old model, might have come from his experiencing the limits of what Marx quite accurately called the expropriation of the worker. The so-called consumer society of midcentury America took notice of Ford’s innovation. A living wage was a novel idea, an object of derision at home and abroad; it is what we are losing now as unemployment and the exploitation of cheap labor in other countries exert a downward pressure on wages here. A consumer society is one in which people can engage in discretionary spending. It is characteristic of Americans that whatever they are or do is what they also ridicule and lament. But the margin above that grim old standard of subsistence is the margin of personal freedom. And, granting fads and excesses, interesting uses have been made of it.
As one important instance, for a long time, by global and historical standards, we have educated a great many people at great cost. Our discretionary spending has also been expressed as a willingness to be taxed for the common benefit. Without question the institutions that we have created have added tremendous wealth to society over time. But we have been talked out of the kind of pragmatism that would allow us to say: This works. No comparably wealthy society has proceeded on the utilitarian principles that would ration access to learning in the name of improving a workforce. One might reply that there has never been a country to compare with the United States in terms of wealth. True enough. But one conclusion we might draw from this state of affairs is that what we did worked. At the very least, our promotion of education has not impeded the great cause of wealth accumulation that is never the stated object of competition, any more than are liberty, the pursuit of happiness, the general welfare, or, indeed, the examined life. By competition’s atavistic lights we should be prepared to navigate uncharted seas, assuming always that nothing else is offered or owed to us but work, which is itself highly conditional, and allowed to us only so long as no cheaper arrangement can be found. With all the urgency of this argument, holding up to us this dismal, threatening future, no mention is made that great wealth will indeed accumulate.
The most efficient system ever devised for the distribution of wealth is a meaningful wage. The most effective means of depressing wages is the mobility of capital, which can move overnight to a more agreeable climate. Our workers — that is, all of us, more or less — are already prepared to understand that, in abandoning America, capital is only obeying the brazen law. There will be a hint of rebuke, as there is now, that we failed to be competitive, which might have something to do with currency fluctuations, or with the misfortune of living in a place that is a little too fastidious in the matter of breathable air and drinkable water. This rebuke will echo through society, demonstrating to many the absolute need to jettison these standards, their immediate cost precluding any thought of their long-term benefit. Public health, too, is distributive, since it entails particular costs to achieve a general benefit that is associated with a society’s ability to prosper. (I avoid the more familiar term “redistributive,” which implies that there is a real, prior ownership of wealth that exists before society’s portion. This notion is not supportable, philosophically or economically, though at the moment it is very powerful indeed.)
There is a great scrum going on, now and also forever, since this model of economy has taken over so much of the world. China seems to be slowing, so India will emerge, presumably, as the next great competitor. We will marvel at the vast increase in one measure or another of its well-being, as we always do, which would occur in any country with a big enough population, since there are smart people everywhere, and since infusions of money and industrial technology will have these effects anywhere. The country’s middle class will expand — as ours would, given capital for new infrastructure and manufacturing, though our past growth would mean that any future increases would be less dramatic. And, at some point, these measures of the prosperity of, say, India would begin to be presented to us as an alarming trend. They would be taken to mean that we were about to be vanquished, left in the dust of mediocrity and decline. Our cult of competition does not seem able to entertain the idea that two or more countries might flourish simultaneously, unless, of course, they are European. It does not permit the thought that our response to the economic rise of India or China or Brazil might properly be to say: Good for them. This is the result of the old habits of nationalism. We are always ready to be persuaded that we are under threat. While care for our terrain and our people and our future can be tarred as socialism, to be swept up in a general alarm that impoverishes all of those is presented as somehow American. What nonsense.
Let us call the stripping down of our society for the purposes of our supposed economic struggle with the world the expropriation of our workers. Academics have made absurd uses of Marx’s categories so often that even alluding to those categories involves risks. But Marx was critiquing political economy, and so am I. And nothing was more powerful in that brutal old system, now resurgent, than low wages. Americans think that Marx was criticizing America, an error of epochal consequence, and one that is propagated in the university as diligently as in any right-wing think tank. They think that political economy, that is, capitalism, was our invention and is the genius of our civilization — the greatest ever, by the grace of capitalism, so they say. Indeed, unread books may govern the world, and not well, since they are so often taken to justify our worst impulses and prejudices. (The Bible is a case in point.)
It is certainly true that our earliest beginnings, as colonies created as extensions of the British industrial system, notoriously involving the Atlantic slave trade, have always haunted us and harmed us, too. Still, access to land, scarcity of labor and labor’s mobility, and the communitarian ethos of the northern colonies changed the economy in fact, then in theory. The iron law of wages lacked the conditions that enforced it in the old country. If Jefferson’s “happiness” is given its frequent eighteenth-century meaning — i.e., prosperity or thriving — then the pursuit of happiness, a level of life above subsistence, would become possible as it had not been for any but the most exceptional members of the British working class. If the system described by political economy is capitalism, then the American colonies began to veer away from capitalism long before the Revolution. In this new environment only chattel slavery could preserve certain of its essential features, notably the immobilization of a workforce that was maintained at the level of subsistence, and a radical polarization of wealth. These were the plantations of the British industrial system, producing cotton for the mills of Manchester. It is remarkable how often they are treated as if they were preindustrial, precapitalist, as if cotton and indigo were comestibles and sugar a dietary staple, and as if the whole arrangement did not run on credit and speculation.
Obviously I am critical of the universities too. They give prestige to exactly the kind of thinking that undermines their own existence as humanist institutions, especially in economics but also in many fields that are influenced by economics, for example psychologies that subject all actions and interactions to cost-benefit analysis, to — the phrase should make us laugh — rational choice. And this bleeds, of course, into the humanities, which are utterly, hopelessly anomalous by these lights, and have run for cover to critical theory. In every case the conception of what a human being is, and with it the thought of what she might be, is made tedious and small. The assumption now current, that the test of a university is its success in vaulting graduates into the upper tiers of wealth and status, obscures the fact that the United States is an enormous country, and that many of its best and brightest may prefer a modest life in Maine or South Dakota. Or in Iowa, as I find myself obliged to say from time to time. It obscures the fact that there is a vast educational culture in this country, unlike anything else in the world. It emerged from a glorious sense of the possible and explored and enhanced the possible through the spread of learning. If it seems to be failing now, that may be because we have forgotten what the university is for, why the libraries are built like cathedrals and surrounded by meadows and flowers. They are a tribute and an invitation to the young, who can and should make the world new, out of the unmapped and unbounded resource of their minds.