My son is vaccinated, but there is one immunization on the standard schedule he did not receive. This was meant to be his very first shot, the hepatitis B vaccine administered to most infants immediately after birth. I was aware, before I became pregnant, of some fears around vaccination. But I was not prepared for the labyrinthine network of anxieties I would discover during my pregnancy, the proliferation of hypotheses, the minutiae of additives, the diversity of ideologies. Vaccines contain preservatives, adjuvants, and residues from their manufacture. Some were developed using cell lines from aborted fetuses, some were tested in Nazi concentration camps, and they are not vegan. And vaccines are metaphors, if popular literature on vaccination is to be read as literature, for capitalist corruption, cultural decadence, and environmental pollution.
The reach of this subject had exceeded the limits of my late-night research by the time my baby was due, so I visited the pediatrician I had chosen to be my son’s doctor. I already knew that some people would consider a medical professional a dubious source of intelligence on vaccination. The money pharmaceutical companies are pouring into research, they would say, has made the information available to doctors dirty. But not all doctors are informed by research, as I would discover, and there is more than one route to unclean thinking.
When I asked the pediatrician what the purpose of the hep B vaccine was, he answered, “That’s a very good question,” in a tone I understood to mean this was a question he relished answering. Hep B was a vaccine for the inner city, he told me — it was designed to protect the babies of drug addicts and prostitutes. It was not something, he assured me, that people like me needed to worry about.
All that this doctor knew of me then was what he could see. He assumed, correctly, that I did not live in the inner city. It did not occur to me to clarify that although I live in the outer city of Chicago, my neighborhood looks a lot like what some people mean when they use the euphemism “inner city.” In retrospect, I am ashamed by how little of his racial code I registered. Relieved to be told that this vaccine was not for people like me, I failed to consider what exactly that meant.
The belief that public-health measures are not intended for people like us is widely held by people like me. Public health, we assume, is for people with less — less education, less-healthy habits, less access to quality health care, less time and money. I’ve heard mothers of my class suggest, for instance, that the standard childhood vaccination schedule groups together multiple shots because poor mothers can’t visit the doctor frequently enough to get the twenty-six recommended shots separately. (No matter that many mothers, myself included, might find so many visits daunting.) That, we seem to be saying of the standard schedule, is for people like them.
When the last nationwide smallpox epidemic began in 1898, some people believed that whites were not susceptible to the disease. It was called “nigger itch” or, where it was associated with immigrants, “Italian itch” or “Mexican bump.” When smallpox broke out in New York City, police officers were sent to help enforce the vaccination of Italian and Irish immigrants in the tenements. And when smallpox arrived in Middlesboro, Kentucky, everyone in the black section of town who resisted immunization was vaccinated at gunpoint. These campaigns did limit the spread of the disease, but most of the risk of vaccination, which at that time could lead to infection with other diseases, was absorbed by the most vulnerable. The poor were forced into the service of the privileged.
Debates over vaccination, then as now, were often cast as debates over the integrity of science, though they could just as easily be understood as conversations about power. The working-class people who resisted England’s 1853 provision of free, mandatory vaccination were concerned, in part, for their own liberty. Faced with fines, imprisonment, and the seizure of their property if they did not vaccinate their infants, they sometimes compared their predicament to slavery. In her history of that antivaccination movement, Nadja Durbach returns often to the idea that the resisters saw their bodies “not as potentially contagious and thus dangerous to the social body, but as highly vulnerable to contamination and violation.” Their bodies were, of course, both vulnerable and contagious. But in a time and place where the bodies of the poor were seen as a source of disease, as dangerous to others, it fell to the poor to articulate that they were also vulnerable.
If it was meaningful then for the poor to assert that they were not purely dangerous, I suspect it might be just as meaningful now for the rest of us to accept that we are not purely vulnerable. The middle class may be “threatened,” but we are still, just by virtue of having bodies, dangerous. Even the little bodies of children, which nearly all the thinking common to our time encourages us to imagine as absolutely vulnerable, are dangerous in their ability to spread disease. Think of the unvaccinated boy in San Diego, for instance, who returned from a trip to Switzerland with a case of measles that infected his two siblings, five schoolmates, and four children in his doctor’s waiting room. Three of these children were infants too young to be vaccinated, and one of them had to be hospitalized.
Unvaccinated children, according to a 2004 analysis of CDC data, are more likely than undervaccinated children to be white and to live in households with an income of $75,000 or more — like my child. Their mothers are more likely to be, like me, married and college-educated. Undervaccinated children, meaning children who have received some but not all of their recommended immunizations, are more likely to be black, to have younger, unmarried mothers, and to live in poverty.
“Vaccination works,” my father, a doctor, tells me, “by enlisting a majority in the protection of a minority.” He means the minority of the population that is particularly vulnerable to a disease. The elderly, in the case of influenza. Newborns, in the case of pertussis. Pregnant women, in the case of rubella. When relatively wealthy white women choose to vaccinate our children, we may also be participating in the protection of poor black children whose single mothers have not, as a result of circumstance rather than choice, fully vaccinated them. This is a radical inversion of the historical approach to vaccination, which was once just another form of bodily servitude extracted from the poor for the benefit of the privileged. There is some truth now to the idea that public health is not strictly for people like me, but it is through us — literally through our bodies — that public health is maintained.
Vaccination is sometimes implicated in all the crimes of modern medicine. But vaccination was a precursor to modern medicine, not the product of it. Its roots are in folk medicine, and its first practitioners were farmers. Milkmaids in eighteenth-century England had faces unblemished by smallpox, as anyone could see. Common wisdom held that if a milkmaid milked a cow blistered with cowpox and developed some blisters on her hands, she would not contract smallpox even while nursing victims of an epidemic.
During an outbreak in 1774, a farmer who had himself already been infected with cowpox used a darning needle to force pus from a cow’s udder into the arms of his wife and two toddler boys. The farmer’s neighbors were horrified. His wife’s arm became red and swollen, and she fell ill before recovering from the infection, but the boys had mild reactions. They were exposed to smallpox many times over the course of their long lives, occasionally for the purpose of demonstrating their immunity, without ever contracting the disease.
Twenty years later, the country doctor Edward Jenner scraped pus from a blister on the hand of a milkmaid into an incision on the arm of an eight-year-old boy. The boy did not contract smallpox, and Jenner continued his experiment on dozens of other people, including his own infant son. Jenner had evidence that vaccination worked, but he did not know why it worked. His innovation was based entirely on observation, not on theory. This was a century before the first virus would be identified, decades before germ theory would be validated, more than a century before penicillin would be extracted from a fungus, and long before the cause of smallpox would be understood.
The essential mechanism underlying vaccination was not new even in Jenner’s time. At that point variolation, the practice of deliberately infecting a person with a minor strain of smallpox in order to prevent infection with a more deadly strain, was still somewhat novel in England but had been practiced in China and India for hundreds of years. (In China, it was said to have been “bestowed by a Taoist immortal.”) Variolation would later be brought to America from Africa by a slave. It was then introduced to England by an ambassador’s wife, her own face scarred by smallpox, who inoculated her children after observing the practice in Turkey. Voltaire, himself a survivor of a serious case of smallpox, implored the French to adopt variolation from the English.
When Voltaire wrote his letter “On Inoculation” in 1735, the primary meaning of the English word “inoculate” was still “to set a bud or scion,” as apple trees are cultivated by grafting a stem from one tree onto the roots of another. There were many methods of inoculation, including the snuffing of powdered scabs and the sewing of an infected thread through the webbing between the thumb and forefinger, but in England it was often accomplished by making a slit or flap in the skin into which infectious material was placed, like the slit in the bark of a tree that receives the young stem grafted onto it. When “inoculate” was first used to describe variolation, it was a metaphor for grafting a disease, which would bear its own fruit, onto the rootstock of the body.
From somewhere deep in my childhood I can remember my father explaining with enthusiasm the principle behind the Doppler effect as an ambulance sped past our car. My father marveled at the world far more often than he talked about the body, but blood types were a subject on which he spoke with some passion. People with O-negative blood, he explained, can receive only O-negative blood in transfusion, but they can give blood to people of any type. That’s why a person with type O negative is known as a “universal donor.” My father then revealed that his blood type was O negative, that he himself was a universal donor. He gave blood, he told me, as often as he could because his type was always in demand. I suspect my father knew that my blood, too, is type O negative.
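The rule my father was describing can be made explicit. What follows is a minimal sketch of the textbook ABO and Rh compatibility logic, assuming the eight familiar blood types; it illustrates the rule itself, not anything particular to his telling.

```python
# A minimal sketch of the transfusion-compatibility rule described above.
# The eight ABO/Rh types and the donor-recipient logic are textbook
# assumptions; this is an illustration, not clinical guidance.

def can_donate(donor: str, recipient: str) -> bool:
    """Return True if donor red cells are compatible with the recipient."""
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    # Rh-positive blood cannot go to an Rh-negative recipient.
    if d_rh == "+" and r_rh == "-":
        return False
    # Every ABO antigen the donor carries must already be present
    # in the recipient (type O carries neither A nor B antigens).
    return set(d_abo.replace("O", "")) <= set(r_abo.replace("O", ""))

types = ["O-", "O+", "A-", "A+", "B-", "B+", "AB-", "AB+"]

# O-negative red cells are compatible with every type: the "universal donor."
assert all(can_donate("O-", r) for r in types)
# But an O-negative recipient can receive only O-negative blood.
assert [d for d in types if can_donate(d, "O-")] == ["O-"]
```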
I understood the idea of the universal donor more as an ethic than as a medical concept long before I knew my own blood type. But I did not yet think of that ethic as an ingenious filtering of my father’s Catholic background through his medical training. I was not raised in the Church and I never took communion, so I was not reminded of Jesus offering his blood that we all might live. But I believed, even then, that we owe each other our bodies.
The very first decision I made for my son, a decision enacted within moments of his body coming free of mine, was the donation of his umbilical-cord blood to a public bank. I myself had donated blood only once, and I wanted my son to start his life with a credit to the bank, not a debt. And this was before I, the universal donor, would become the recipient of two pints of blood in a transfusion shortly after my son’s birth. Blood of the most precious type, drawn from a public bank.
If we imagine the action of a vaccine not just in terms of how it affects a single body but also in terms of how it affects the collective body of a community, it is fair to think of vaccination as a kind of banking of immunity. Contributions to this bank are donations to those who cannot or will not be protected by their own immunity. This is the principle of “herd immunity,” and it is through herd immunity that mass vaccination becomes far more effective than individual vaccination.
Any given vaccine can fail to produce immunity in an individual, and some vaccines, like the influenza vaccine, often fail to produce immunity. But when enough people are given even a relatively ineffective vaccine, viruses have trouble moving from host to host and cease to spread, sparing both the unvaccinated and those in whom vaccination has not produced immunity. This is why the chances of contracting measles can be higher for a vaccinated person living in a largely unvaccinated community than for an unvaccinated person living in a largely vaccinated community.
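The arithmetic behind that counterintuitive comparison is simple enough to sketch. In the rough model below, a person’s chance of infection is personal susceptibility multiplied by community exposure; all the specific numbers are invented for illustration, though the threshold formula in the comments is the standard expression of herd immunity.

```python
# A rough sketch of the comparison above: chance of infection is treated
# as personal susceptibility multiplied by local exposure. Every number
# is invented for illustration; only the structure of the argument is
# the point. The standard herd-immunity threshold is 1 - 1/R0, which for
# measles (R0 commonly cited as 12 to 18) works out to roughly 92-94%.

def infection_risk(susceptibility: float, attack_rate: float) -> float:
    """How susceptible you are times how much disease circulates near you."""
    return susceptibility * attack_rate

VACCINE_FAILURE = 0.03  # assumed: the vaccine fails to immunize ~3% of recipients

# Assumed outbreak exposure: disease moves freely through a largely
# unvaccinated community and barely at all through a vaccinated one.
exposure_in_unvaccinated_town = 0.90
exposure_in_vaccinated_town = 0.01

vaccinated_in_unvaccinated_town = infection_risk(
    VACCINE_FAILURE, exposure_in_unvaccinated_town)
unvaccinated_in_vaccinated_town = infection_risk(
    1.0, exposure_in_vaccinated_town)

# ~0.027 versus ~0.010: the vaccinated person in the unvaccinated town
# faces the greater risk.
assert vaccinated_in_unvaccinated_town > unvaccinated_in_vaccinated_town
```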
The boundaries between our bodies begin to dissolve here. Blood and organs move between us, exiting one body and entering another, and so, too, with immunity, which is a common trust as much as it is a private account. Those of us who draw on collective immunity owe our health to our neighbors.
My father has a scar on his left arm from his smallpox vaccination half a century ago. That vaccine was responsible for the worldwide eradication of smallpox in 1980, but it remains far more dangerous than any vaccine currently on our childhood immunization schedule. The risk of death after vaccination for smallpox is, according to one estimate, about one in a million. The risk of hospitalization is about one in a hundred thousand, and the risk of serious complications is about one in a thousand.
Thirty years after routine vaccination for smallpox ended in this country, the federal government asked researchers at the University of Iowa to test the remaining stores of the vaccine for efficacy. This was in the long moment after 9/11 when every potential terrorist attack was anticipated, including the use of smallpox as a biological weapon. The smallpox vaccine proved effective even after having been stored for decades and diluted to increase the supply. But the results of the vaccine trial, says Patricia Winokur, director of the school’s Vaccine Research and Education Unit, were “unacceptable by today’s standards.” A third of the people who received the vaccine suffered fevers or rashes and were sick, in some cases, for several days. Everyone recovered, even those who developed serious inflammations of the heart, but it was clear that the degree of risk was not what we have come to expect from immunization.
The smallpox vaccine contains far more immunizing proteins — more of the active ingredient, so to speak — than any of the vaccines we use today. In that sense, the vaccine our parents received presented a greater challenge to the immune system in one dose than do the twenty-six immunizations for fourteen diseases we now give our children over the course of two years. Still, the proliferation of childhood vaccines has become, for some of us, symbolic of American excess. “Too much, too soon,” one of the slogans of vaccine activism, could easily be a critique of just about any aspect of our modern lives.
When asked by his colleagues to address the question of whether too many vaccines are given too early in life, the University of Pennsylvania pediatrics professor Paul Offit set out to quantify the capabilities of the infant immune system, which was already known to be quite impressive. Infants, after all, are exposed to an onslaught of bacteria the moment they leave the womb, even before they exit the birth canal. Any infant who does not live in a bubble is likely to find the everyday work of fighting off infections more taxing than processing the antigens from the weakened or inactivated pathogens in multiple immunizations.
Offit is the director of the Vaccine Education Center at the Children’s Hospital of Philadelphia and head of its Division of Infectious Diseases. He is also, if you believe the Internet, a “Devil’s servant” known as “Dr. Proffit.” He earned this distinction by co-inventing a vaccine that made him several million dollars. The idea that the success of his vaccine, which took twenty-five years to develop, should invalidate his expertise in immunology is somewhat baffling to Offit. But he understands the other source of his infamy. In response to the question of how many vaccines is too many, Offit determined that a child could theoretically handle a total of 100,000 vaccines or up to 10,000 vaccinations at once. He came to regret this number, though he does not believe it to be inaccurate. “The 100,000 number makes me sound like a madman,” he told Wired. “Because that’s the image: 100,000 shots sticking out of you. It’s an awful image.”
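Offit’s published derivation is not reproduced here, but a back-of-envelope estimate of that kind has a simple shape, sketched below with frankly assumed figures chosen only to land near the same order of magnitude.

```python
# A Fermi-style sketch of how a capacity estimate like Offit's can be
# structured. Both figures below are illustrative assumptions, not
# values taken from his paper.

B_CELLS_PER_ML = 1e7       # assumed: circulating B cells per mL of blood
CLONES_PER_VACCINE = 1e3   # assumed: B-cell clones needed to answer one vaccine

vaccines_at_once = B_CELLS_PER_ML / CLONES_PER_VACCINE
print(f"theoretical capacity: ~{vaccines_at_once:,.0f} vaccines at once")
# -> theoretical capacity: ~10,000 vaccines at once
```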
In a 2009 article for Mothering magazine, Jennifer Margulis expresses outrage that newborn infants are routinely vaccinated for hep B and wonders why she was encouraged to vaccinate her daughter “against a sexually transmitted disease she had no chance of catching.” Hep B is transmitted through bodily fluids, so the most common way that newborns contract hep B is from their mothers. Babies born to women who are infected with hep B — and mothers can carry the virus without knowing it — will very likely be infected if they are not vaccinated within twelve hours of birth. Like human papillomavirus and a number of other viruses, hep B is a carcinogen. Newborns infected with it are at a high risk of developing long-term problems like liver cancer, but people of all ages can carry the disease without symptoms. Before the vaccine for hep B was introduced, the disease infected 200,000 people a year, and about a million Americans were chronically infected.
One of the mysteries of hep B immunization is that vaccinating only “high-risk” groups, which was the original public-health strategy, did not bring down rates of infection. When the vaccine was introduced in 1981, it was recommended for prisoners, health-care workers, gay men, and IV-drug users. But rates of hep B infection remained unchanged until 1991, when the vaccine was recommended for all newborns. Only mass vaccination brought down the rates of infection, and since 1991 vaccination has virtually eliminated the disease in children.
Risk, in the case of hep B, turns out to be a rather complicated assessment. There is risk in having sex with just one partner, getting a tattoo, or traveling to Asia. In many cases, the source of infection is never known. I decided before my son’s birth that I did not want him vaccinated for hep B, but it did not occur to me until months later that although I did not belong to any risk groups at the moment he was born, by the time I put him to my breast I had received a blood transfusion and my status had changed.
“Everyone who is born holds dual citizenship, in the kingdom of the well and in the kingdom of the sick,” Susan Sontag wrote in her introduction to Illness as Metaphor. “Although we all prefer to use only the good passport, sooner or later each of us is obliged, at least for a spell, to identify ourselves as citizens of that other place.” Sontag wrote these words while being treated for cancer. She wrote, as she later explained, to “calm the imagination.” Those of us who have lived most of our lives in the kingdom of the well may find our imaginations already placid. Not all of us think of health as a transient state from which we may be exiled without warning. Some prefer to assume health as an identity. I am healthy, we say, meaning that we eat certain foods and avoid others, that we exercise and do not smoke. Health, it is implied, is the reward for living the way we live, and lifestyle is its own variety of immunity.
An 1881 handbill entitled The Vaccination Vampire warns of the “universal pollution” delivered by the vaccinator to the “pure babe.” The macabre sexuality of the vampire dramatized the fear that there was something sexual in the act of vaccination, an anxiety that was only reinforced when sexually transmitted diseases were spread through arm-to-arm vaccination. Until the advent of the hollow needle, vaccination often left a wound that would scar — “the mark of the beast,” some feared. One 1882 sermon likened vaccination to an injection of sin, an “abominable mixture of corruption, the lees of human vice, and dregs of venial appetites, that in after life may foam upon the spirit, and develop hell within, and overwhelm the soul.”
While vaccination now rarely leaves a mark, our fears that we will be permanently marked have remained. We fear that vaccination will invite autism or any one of the diseases of immune dysfunction now plaguing industrialized countries — diabetes, asthma, allergies. We fear that the hep B vaccine will cause multiple sclerosis, or that the diphtheria-pertussis-tetanus vaccine will cause sudden infant death syndrome. We fear that the formaldehyde in some vaccines will cause cancer, or that the aluminum in others will damage our brains.
It was the “poison of adders, the blood, entrails and excretions of rats, bats, toads and sucking whelps,” that were imagined into the vaccines of the nineteenth century. This was the kind of organic matter, the filth, believed to be responsible for most disease at that time. Vaccination was dangerous then. Not because it would cause a child to grow the horns of a cow, but because it could spread diseases like syphilis when pus from one person was used to vaccinate another. Even when vaccination no longer involved an exchange of bodily fluids, bacterial infection remained a problem. In 1901, a contaminated batch of smallpox vaccine caused a tetanus outbreak that killed nine children in Camden, New Jersey. And in 1916, a typhoid vaccine carrying staph bacteria killed four children in Columbia, South Carolina.
Now our vaccines are, if all is well, sterile. Some contain preservatives to prevent the growth of bacteria. So now it is, in the antivaccine activist Jenny McCarthy’s words, “the frickin’ mercury, the ether, the aluminum, the antifreeze” that we fear in our vaccines. These substances are mostly, like the pollutants that threaten our environment today, inorganic. They are not of the body, or so we think. Although there is no ether or antifreeze in any vaccines, many do contain traces of the formaldehyde used to inactivate viruses. This can be alarming to those of us who associate formaldehyde with dead frogs in glass jars, but the chemical is produced by our bodies and is essential to our metabolism. The amount of formaldehyde already circulating in our systems is considerably greater than the amount we might receive through vaccination.
As for mercury, the ethylmercury preservative used in many vaccines until the late 1990s, thimerosal, is now only in some flu vaccines. Ethylmercury is cleared more easily by the body than the methylmercury often found in breast milk, and a child will almost certainly get more mercury exposure from her immediate environment than from vaccination. This is true too of aluminum, an adjuvant used in some vaccines to intensify the immune response. Aluminum is in a lot of things, including fruits and cereals as well as, again, breast milk. Our breast milk, it turns out, is as polluted as our environment. It contains paint thinners, dry-cleaning fluids, flame retardants, pesticides, and rocket fuel. “Most of these chemicals are found in microscopic amounts,” the journalist Florence Williams notes, “but if human milk were sold at the local Piggly Wiggly, some stock would exceed federal food-safety levels for DDT residues and PCBs.”
When my son was six months old, at the peak of the H1N1 flu pandemic, another mother told me that she did not believe in herd immunity. It had not occurred to me until then that herd immunity was subject to belief, though there is clearly something of the occult in the idea of an invisible cloak of protection cast over an entire population. Herd immunity, an observable phenomenon, is implausible only if we think of our bodies as inherently disconnected from other bodies. Which, of course, we do.
One of the unfortunate features of the term “herd immunity” is that it invites association with the term “herd mentality,” a stampede toward stupidity. The herd, we assume, is foolish. Those of us who eschew the herd mentality tend to prefer a frontier mentality in which we imagine our bodies as isolated homesteads. The health of the homestead next to ours does not affect us, this thinking suggests, so long as ours is well tended.
If we were to recast the herd as a hive, perhaps the concept of shared immunity might be more appealing. Honeybees are industrious environmental do-gooders who also happen to be hopelessly interdependent. The health of any individual bee, as we know from the recent epidemic of colony collapse disorder, depends on the health of the hive.
The idea that our lives are dependent on our hive might not be very heartening. There are many well-documented instances of crowds making bad decisions — lynching is the first example that comes to mind for me. But the journalist James Surowiecki argues in The Wisdom of Crowds that large groups routinely solve complex problems whose solutions evade individuals. Groups of people, if they are sufficiently diverse and free to disagree, can provide us with thinking superior to that of any one expert. Groups can locate lost submarines, predict terrorist attacks, and reveal the cause of a new disease. Science, Surowiecki reminds us, is “a profoundly collective enterprise.” It’s a product of the herd.
But “herd immunity” suggests we are only so many cattle, waiting, perhaps, to be sent to slaughter. We may feel, when herded toward vaccination, that we are assuming the dumb submission of animals who look on passively as they are daily robbed of their babies’ milk. It is no wonder some mothers resist a metaphoric milking.
“The Circassian women,” wrote Voltaire, “have, from time immemorial, communicated the smallpox to their children when not above six months old by making an incision in the arm, and by putting into this incision a pustule, taken carefully from the body of another child.” It was women who inoculated their children, and Voltaire mourned the fact that the “lady of some French ambassador” had not brought the technique from Constantinople to Paris. “What prompted the Circassians to introduce this custom, which seems so strange to others,” Voltaire wrote, “is a motive common to all: maternal love and self-interest.”
Medical care was still mainly the domain of women then, though the tradition of the female healer was already threatened. Midwives and wise women, guilty of crimes that included providing contraception and easing the pains of labor, were persecuted in the witch hunts that spread across Europe from the fifteenth to the eighteenth century. While women were being killed for their suspicious ability to heal the sick, physicians in European universities studied Plato and Aristotle but learned very little about the body. They did not experiment, did not practice science as we know it, and had little empirical data to support their treatments, which were often superstitious in nature. Women healers were also susceptible to superstition, but as far back as the early Middle Ages midwives were using ergot to speed contractions and belladonna to prevent miscarriage. St. Hildegard of Bingen catalogued the healing properties of 213 medicinal plants, and female lay healers were using recipes for painkillers and anti-inflammatories at a time when physicians were still writing scripture on the jaws of their patients to heal toothaches.
Benjamin Rush, one of the fathers of American medicine, bled his patients to, as the writers Barbara Ehrenreich and Deirdre English put it, “Transylvanian excesses.” In the late eighteenth and early nineteenth centuries, patients were bled until they fainted, dosed with mercury, and blistered with mustard plasters. While women were excluded from formal medical education, male physicians competed, sometimes aggressively, with their informal practice in the home. But the art of healing, as doctors were to discover, is rather difficult to commodify. It was the pressures of the marketplace, Ehrenreich and English suggest in For Her Own Good, that led to the practice of “heroic” medicine, which relied heavily on dangerous therapies like bleeding. The purpose of heroic medicine was not so much to heal the patient as to produce some measurable — and, ideally, dramatic — effect for which the patient could be billed. Rush, for one, was accused at the time of killing more patients than he cured.
As doctors began to replace midwives in the nineteenth century, childbirth moved into hospitals, and the maternal death rate rose. We now know that childbed fever, as puerperal sepsis was called, was spread by doctors who did not wash their hands between exams. But it was blamed on tight petticoats, fretting, and bad morals. In the twentieth century, poorly understood illnesses like schizophrenia would be blamed on bad mothers, as would marginalized behaviors such as homosexuality. Autism, according to a prevailing theory in the 1950s, was caused by insensitive “refrigerator mothers.”
Even a moderately informed woman squinting at the rough outlines of a terribly compressed history of medicine can discern that quite a bit of what has passed for science in the past two hundred years, particularly where women are concerned, has not been the result of scientific inquiry so much as it has been the refuse of science repurposed to support existing ideologies. In this tradition, Andrew Wakefield’s now-retracted 1998 Lancet study of twelve children with both developmental disorders and intestinal problems advanced a hypothesis that was already in the air — the children were referred to Wakefield by antivaccine activists, and the study was funded by a lawyer preparing a lawsuit built around many of the same children. Wakefield speculated, on the basis of evidence later revealed to be falsified, that the measles-mumps-rubella vaccine might be linked to a behavioral syndrome. While the publicity around Wakefield’s paper precipitated a dramatic drop in vaccination against measles, the paper itself concluded, “We did not prove an association between measles, mumps, and rubella vaccine and the syndrome described,” and the primary finding of the study was that more research was needed. Those who went on to use Wakefield’s inconclusive work to support the notion that vaccines cause autism are guilty not of ignorance or science denial but of using weak science as it has always been used — to lend credibility to an idea that we want to believe for other reasons.
Believing that vaccination causes devastating diseases allows us to tell ourselves a story we already know — what heals may harm, and the sum of science is not always progress. “Women know very well that knowledge from the natural sciences has been used in the interests of our domination and not our liberation,” the science historian Donna Haraway writes. And this understanding, she observes, can render us less vulnerable to the seductive claims of absolute truth that are sometimes made in the name of science. But it can also invite us to undervalue the place and importance of scientific knowledge. We need science, Haraway warns. And where it is not built on social domination, science can be liberating.
It is difficult to read any historical account of smallpox without encountering the word “filth.” In the nineteenth century, smallpox was widely considered a disease of “filth,” which meant that it was understood to be a disease of the poor. According to filth theory, any number of contagious diseases were caused by bad air that had been made foul by excrement or rot. The sanitary conditions of the urban poor threatened the middle classes, who shuttered their windows against the air blowing off the slums at night. Filth, it was thought, was responsible not just for disease but also for immorality.
Filth theory would eventually be replaced by germ theory as an understanding of the nature of contagion, but it wasn’t entirely wrong or useless. Raw sewage running in the streets can certainly spread diseases — though smallpox isn’t one of them — and the sanitation efforts inspired by filth theory greatly improved public health. The reversal of the Chicago River, for instance, so that sewage was not delivered directly to Lake Michigan, the city’s drinking-water supply, had some obvious benefits for the citizens of Chicago.
Now, the mothers I meet on the beaches of Lake Michigan do not worry much over filth. Some of us are familiar with the hygiene hypothesis, the notion that a child’s immune system needs to encounter germs to develop properly, and most of us believe that dirt is good for our kids. But the idea that toxins, rather than filth or germs, are the root cause of most maladies is a popular theory of disease among people like me. The toxins that distress us range from pesticide residue to high-fructose corn syrup. Particularly suspect substances include the bisphenol A lining our tin cans, the phthalates in our shampoos, and the chlorinated Tris in our couches and pillows.
The definition of “toxin” can be somewhat surprising if you have grown accustomed to hearing it in the context of flame retardants and parabens. Though “toxin” is now often used to refer to manmade chemicals, the more precise meaning of the term is reserved for biologically produced poisons. The pertussis toxin, for example, is responsible for damage to the lungs that can cause whooping cough to linger for months after the bacteria that produced it have been killed by antibiotics. The diphtheria toxin is potent enough to cause massive organ failure, and tetanus bacteria produce a deadly neurotoxin. All of which we now protect against with vaccination.
Though “toxoid” is the term for a toxin that has been rendered no longer toxic, the existence of a class of vaccines called toxoids probably does not help quell widespread concerns that vaccination is a source of toxicity. The consumer advocate Barbara Loe Fisher routinely stokes these fears, referring to vaccines as “biological agents of unknown toxicity” and calling for the development of “nontoxic” preservatives and for more studies on the “toxicity of all other vaccine additives” and their potential “cumulative toxic effects.”
In this context, fear of toxicity strikes me as an old anxiety with a new name. Where the word “filth” once suggested, with its moralistic air, the evils of the flesh, the word “toxic” now condemns the chemical evils of our industrial world. This is not to say that concerns over environmental pollution are unjustified — like filth theory, toxicity theory is anchored in legitimate dangers.
The way we now think about toxicity bears some resemblance to the way we once thought about filth. Both theories imagine urban environments as inherently unhealthy. And both allow their subscribers to maintain a sense of control over their own health by pursuing personal purity. For the filth theorist, this meant a retreat into the home, where heavy curtains and shutters might seal out the smell of the poor and their problems. Our version of this shuttering is now achieved through the purchase of purified water, air purifiers, and food marketed with the promise of purity.
Purity is the seemingly innocent concept behind a number of the most sinister social actions of the past century. A passion for bodily purity drove the eugenics movement and led to the sterilization of women and men who were deaf, blind, disabled, or just poor. Concerns for bodily purity were behind miscegenation laws that persisted more than a century after the abolition of slavery, and behind the sodomy laws that were only recently declared unconstitutional. Quite a bit of human solidarity, it seems, has been sacrificed to preserve some kind of imagined purity.
If we do not yet know exactly what the presence of a vast range of chemicals in umbilical-cord blood and breast milk might mean for the future of our children’s health, we do at least know that we are no cleaner, even at birth, than our environment at large. We have more microorganisms in our guts than we have cells in our bodies — we are crawling with bacteria and we are full of chemicals. We are, in other words, continuous with everything here on earth. Including — and especially — each other.