The personal tragedy that shaped Joe Biden’s life was essential to the narrative of his 2020 campaign for president, as it had been during his past campaigns and for most of his political life. In January 2020, before the pandemic killed hundreds of thousands of Americans, the New York Review of Books had already described Biden as our “designated mourner.” As deaths from the coronavirus began to rise, National Review suggested that Biden’s empathy might be his most important quality in this “time of national sorrow.” The New York Times hailed Biden as an “emissary of grief,” and the Washington Post declared him our “mourner in chief.”
In his 2017 memoir, Promise Me, Dad, Biden describes the deaths of his wife and young daughter in a 1972 car accident, which also badly injured his two sons, as a tragedy that left him devastated: “Losing Neilia and Naomi had taken all the joy out of being a United States senator; it had taken all the joy out of life.” He writes about coming to understand “how suicide wasn’t just an option but a rational option,” though he was held back by his responsibility to his boys, Beau and Hunter. The book goes on to talk about Beau’s death from brain cancer at the age of forty-six. Biden visited his grave, and the graves of Neilia and Naomi, on the morning of Election Day, as well as on the morning after the race was finally called in his favor.
There is no word in the English language for the parent of a dead child. No equivalent of widow, widower, or orphan, even of fatherless or motherless—words denoting losses so grave that they assign people to new human categories. Do we lack such a word because that grief is the most tragic of all family losses, the hardest to contemplate? Or is it possible that the opposite is true—that throughout history, the likelihood that a parent would lose a child was so high that such a term would have been a useless distinction?
If such a term existed, it would have applied to every eighteenth- and early-nineteenth-century president who had children.* The experience, though not unexpected, was tragic and life-changing, and the sorrow endured. John Quincy Adams wrote to his mother, Abigail, in 1812: “We have lost our dear and only daughter . . . as lovely and promising a child, as ever was taken from the hopes of the fondest parent.” Adams was on a lengthy diplomatic mission to Russia at the time, and the baby had been born—and had died—in St. Petersburg. Adams certainly grieved for his daughter Louisa, but it would probably not have occurred to him—or to his biographers—that her death was one of the defining events of his life or that it would change him as a statesman; such losses were too common and too predictable for that.
In 1851, Charles Dickens wrote to his wife, Catherine, who was away in Malvern, a spa town famous for its healing waters, that their eight-month-old daughter had suddenly fallen ill. He told Catherine to come home to London, but to come in a spirit of “perfect composure,” since, after all,
we never can expect to be exempt, as to our many children, from the afflictions of other parents—and that if—if—when you come, I should even have to say to you, “our little baby is dead,” you are to do your duty to the rest.
In fact, when he wrote that letter, their little Dora was already dead; she had gone into convulsions one day and died before her father could be summoned home from the London Tavern, where he was speaking at the annual dinner of the General Theatrical Fund. Dickens’s best friend, the critic John Forster, was called out of the room and informed of Dora’s death; he decided to let Dickens make his remarks before imparting the sad news. Dickens’s speech happened to be about how actors must play their parts even when they have experienced scenes of suffering and death in their own lives, and Forster later wrote, “I remember to this hour with what anguish I listened.”
In his turn, Dickens decided to refrain from telling his wife immediately. A decade earlier, Dickens had achieved great success with his novel The Old Curiosity Shop, in which the angelic thirteen-year-old heroine, Nell Trent, is cast out into an uncaring world where she suffers, sickens, and dies. Dickens, publishing the novel in installments, had teased out Little Nell’s decline from chapter to chapter for an eager reading public, ultimately giving her a death so steeped in pathos that it still stands as a high-water mark of Victorian sentimentalism. Now, with his own baby suddenly dead, he broke the news to his wife in installments, preparing her for the worst by reminding her of the grim truth that no family “can expect to be exempt” from child mortality, a feeling captured by Henry Wadsworth Longfellow’s poem “Resignation,” written after the death of his daughter Fanny in 1848:
There is no flock, however watched and tended,
But one dead lamb is there!
There is no fireside, howsoe’er defended,
But has one vacant chair!
When I trained in pediatrics at Boston Children’s Hospital in the Eighties, almost no pediatric death was considered inevitable. Even on the scariest wards, with the sickest children, the odds were supposed to be good and getting better. In pediatric oncology, survival rates were higher and higher, we told one another. The most extreme congenital cardiac anomalies were still hard to fix, but the majority of malformed hearts could be repaired, and there was always the possibility of a transplant. Children are not supposed to die—that was the message of the hospital, the promise of the profession. Our teachers told us the story of another presidential child: Patrick Bouvier Kennedy, born in August 1963, the only baby born to a sitting president in the twentieth century. Jacqueline Kennedy went into labor five and a half weeks early, while the family was vacationing in Hyannis Port, and the baby, who weighed a little less than five pounds, was rushed to Boston Children’s, where he died of respiratory distress syndrome—the lung disease of premature infants—because, president’s son or not, there was no way to ventilate a premature newborn in 1963.
We remembered this story as we took our turns in the gleaming neonatal intensive care unit, where we routinely intubated and ventilated babies born twelve, fourteen, even sixteen weeks early, babies who weighed less than two pounds: Look how far we’ve come in twenty years! The residents who trained a little later, during the presidency of George H. W. Bush, probably heard about his daughter Robin when they worked on the oncology ward. Born to George and Barbara Bush in 1949, Robin died of leukemia in 1953, at a time when the disease was almost always lethal. By the time her father was elected president, three and a half decades later, the mortality rate associated with childhood leukemia had plummeted. Children are not supposed to die.
“Although dying is a part of life, a child’s death, in a very real sense, is unnatural.” So begins When Children Die, a 2003 manual from the Institute of Medicine. It goes on to say that “the death of a child is a special sorrow, an enduring loss for surviving mothers, fathers, brothers, sisters, other family members, and close friends.” That was a statement characteristic of the new millennium. Until the middle of the twentieth century, that lifelong loss and special sorrow were neither unnatural nor unexpected. Children used to die, and parents knew that losing children was a relatively common and even predictable risk. In 1800, nearly half the children born in the United States died before the age of five. By 1900, between a fifth and a quarter of them did; in 1915, as my grandparents were growing up, one out of every ten infants died before turning one—and there was no way to prevent most of the common infectious diseases of childhood, from whooping cough and pneumonia to scarlet fever and tuberculosis, which regularly killed toddlers and school-age children.
When you start looking in the margins of history for the lost children, they are present in every story, peering out from the edges of family portraits, buried under sad little headstones in old cemeteries. Among the rich and famous, dead children are noted sometimes just as footnotes in biographies. Creating a world in which children are not supposed to die may be our greatest achievement as a species, a victory over thousands of years of suffering, sorrow, and shadow.
At the beginning of the twentieth century, almost a hundred years after Louisa Adams died of dysentery, there were only a couple of ways that parents, whether powerful or penniless, could guard against infant mortality. The nineteenth century had seen public-health advances that raised survival rates for everyone, such as municipal sewer systems and improvements to air quality, ventilation, and garbage collection in crowded cities. But medicine still had nothing to offer a small child who contracted dysentery from microbes in animal milk or drinking water. As the Industrial Revolution drew more poor women into manufacturing jobs in Europe and the United States, infants were often sent to dubious wet-nursing establishments or fed alternatives to breast milk, first animal milk and later baby formula. Thousands of infants died every year in American cities of the so-called summer diarrhea, likely brought on by contaminated milk or water. Pasteurization, which had been developed by Louis Pasteur in the 1860s to prevent the spoilage of wine, would not become mandatory in any American city until 1908, and was not made national policy until long after that.
At the turn of the twentieth century, no one was formally recording rates of infant mortality in the United States, but it is estimated that about 100 of every 1,000 white infants died before their first birthdays, with the number being much higher among black infants—perhaps 170 per 1,000 live births. And many more children died in the early years of life, some from epidemic infectious diseases such as smallpox and diphtheria, or from complications that followed routine infections such as pneumonia after measles.
Reducing infant mortality became a prominent international cause: organizations were founded in France (1902), the Netherlands (1908), and Germany (1909). In the United States, the first meeting of the American Association for the Study and Prevention of Infant Mortality was held at Johns Hopkins University in 1910. The reformers and public-health activists battling infant mortality were not armed with medical miracles; vaccines for common childhood illnesses—other than the one for smallpox—had yet to be developed, though the science of bacteriology was making great strides in understanding the pathology of infectious disease. There was a new awareness that the causes of infant death could be found in a family’s living conditions. A book by George Newman published in England in 1906 defined the issue—Infant Mortality: A Social Problem.
Social problems require social solutions, and many groups took on targeted campaigns. Contaminated or adulterated milk was a major concern; when the Rockefeller Institute for Medical Research was established in 1901, one of the first things it did was survey the distribution of milk in New York City, demonstrating clear connections between contaminated milk and child death. The movement to provide clean milk to the poor at special dispensaries attracted public-health activists and philanthropists, most notably Nathan Straus, an owner of Macy’s, who set up milk stations all over the city and fought for many years to see pasteurization made mandatory. When another future president lost an infant, he reacted by joining the movement. Franklin and Eleanor Roosevelt’s third child, a son born on March 18, 1909, was named after his father: Franklin Delano Roosevelt Jr. He was a robust infant, at eleven pounds, and was immediately registered for Groton. But his health during his first few months was uncertain; he had episodes of rapid breathing and developed a heart murmur. At the end of the summer, he came down with influenza, and on November 1, at the age of seven months, he died of what the doctors called endocarditis. Moved by the experience of losing a child, FDR joined the board of the New York Milk Committee, which was working to bring down the high infant mortality rate in the city’s tenements.
The Roosevelts gave the same name—Franklin Jr.—to another son, born in 1914. He would grow up to play a small but prominent role in the advent of antibiotics. In 1936, at the age of twenty-two, he developed a serious streptococcal throat infection with sinus complications and was treated with Prontosil, an antibacterial drug that had been introduced only a year earlier. His recovery warranted a front-page headline in the New York Times: young roosevelt saved by new drug. And the case did a great deal to publicize these new medications. Only twelve years earlier, in 1924, before the widespread availability of antibiotics, another presidential son, sixteen-year-old Calvin Coolidge Jr., had gotten a blister on his toe while playing lawn tennis at the White House on the last day of June. He went on to develop a fever, along with clear signs of “blood poisoning”—the presence of bacteria in the blood, causing a condition known as sepsis—and died at Walter Reed General Hospital on July 7. His father, who declined to seek reelection in 1928, wrote in his autobiography, “When he went, the power and the glory of the Presidency went with him”—language Biden would echo almost ninety years later.
The rest of the twentieth century would bring immunizations against diphtheria, whooping cough, tetanus, and polio, all of which were often lethal in young children. It would bring a new understanding of how to treat diarrhea, which could (and still can) quickly dehydrate a small body to the point of death; we now routinely immunize children against rotavirus, which remains the most common cause of severe diarrhea in infants and small children. Antibiotics would take much of the terror out of diseases such as scarlet fever and pneumonia. We would make advances in repairing congenital anomalies such as heart defects, in prolonging life for children with genetic problems such as sickle cell anemia and cystic fibrosis, and in bringing down death rates from childhood cancer. In the second half of the century, neonatology made tremendous strides in caring for premature infants. And then there were the safety campaigns, from requiring car seats to encouraging parents to put babies to sleep on their backs. By 1950, infants were dying at a rate of 29.2 per 1,000 live births, and by 2018 it was down to 5.7, though the racial disparities have remained stark; the infant mortality rate among African Americans was 10.8 per 1,000, while for whites it was 4.6. These days, very few children who live to their first birthday go on to die in early childhood.
Losing a child has become a comparatively rare experience, and one that everyone, including experts in pediatric palliative care, now regards as unnatural and traumatic. Does that make the grief harder to bear? Those who work in clinical venues where children still sometimes die, from pediatric intensive care to pediatric oncology, have become increasingly aware that in a world where losing a child is an unusual event, surviving family members need to be treated as fragile and vulnerable. A 2018 article in the journal Pediatric Critical Care Medicine titled “Caring for Parents After the Death of a Child” begins by describing such a death as “a traumatic experience that can put parents at risk for adverse mental and physical health during bereavement,” and goes on to cite evidence that bereaved parents are at higher risk of a variety of illnesses and even of dying themselves. Mothers face a higher risk than fathers, and those who lose younger children face higher risk than those whose children die as adults.
Modern parents who have lost children often describe feeling a sense of desolate isolation; there is no way for them to mention the dead child or the child’s death in conversation without drawing attention to a tragedy so unthinkable that it crowds out all other subjects. The authors of the 2018 article consider the possibility that the very rareness of childhood death in developed countries may mean that family and friends don’t know how to respond to or help bereaved parents. The Roosevelts certainly grieved their dead son, but they matter-of-factly reused his name, unworried that calling another baby Franklin Jr. might bring up unthinkably sad memories.
The American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders (DSM) used to include something called the “bereavement exclusion.” A recently bereaved person who would otherwise have met the criteria for major depressive disorder would not be given a psychiatric diagnosis, but would be understood to be showing depressive symptoms as a normal part of mourning. The fifth edition of the DSM was published in 2013, and during the revision process, a very public debate erupted over a proposal to eliminate the bereavement exclusion out of concern that dangerous depression after the loss of a loved one could be overlooked. Some experts countered that eliminating the exclusion was tantamount to medicalizing normal grief, and this, of course, raised another question: What is “normal”? As the debate sharpened, in professional publications as well as in the lay press, a grief counselor put forward the idea that even suicidal thoughts can be a normal part of grieving after parents lose a child. In the end, the bereavement exclusion was removed from the DSM-5.
Today, parents feel that if they make all the right decisions, from the right sleep position to the right car seat to the right foods, they can keep their children safe. But these good and valuable steps to ensure children’s safety can leave some parents terrified that somewhere along the way they will make a mistake, a wrong turn. My guess would be that for all their anxieties, the parents of my grandparents’ generation did not live with the same fear that one bad decision could compromise their children’s safety, because they didn’t believe their children could be kept absolutely safe—no child could be. Of course they worried—think how they must have worried over babies with diarrhea, over older children with fevers and sore throats, over polio and diphtheria scares—but it’s possible that in a world where every parent, rich or poor, was subject to those same worries, they did not live with perpetual anxiety.
In our overanxious age, worrying is sometimes now associated with the problem of overparenting. Worried All the Time: Rediscovering the Joy in Parenthood in an Age of Anxiety, a book by the psychologist David Anderegg, was originally published in 2003 with the subtitle Overparenting in an Age of Anxiety and How to Stop It. Anxiety, he writes, not only gnaws at parents but can get in the way of children’s development.
Whereas grief is defined more and more as mental illness, anxiety, already the most common mental illness, is here construed as a parental failing. Parents need to make the right decision at every turn, check every box on the safety list, follow all medical advice, and avert all possible threats and dangers, but they need to do it without worrying. It’s an impossible assignment. Similarly, pediatricians now live in fear of missing a diagnosis, losing a single child to a preventable disease. When I think about the doctors working a century ago, who saw so many babies die of diarrhea, I am deeply grateful for all our medical advances, but there is a different kind of pressure that comes with practicing in the hope—and the expectation—that if you do everything right, every child will be okay, will go on to outgrow your pediatric practice and move into the realm of adult medicine (where, eventually, deaths have to be accepted).
Children should not die. The voices of parents from the past make it clear that they felt this as strongly as we do, that they loved and mourned and remembered their lost sons and daughters. “Forty years has not obliterated from my mind the anguish of my Soul,” Abigail Adams wrote of her infant’s death. As I read the words of bereaved parents, I found myself thinking about the ways that we now love the older generation. I thought about the way that I loved my mother during the final years of her life, when I knew she was fragile and that I might not have her with me for much longer; when every new problem scared me profoundly, even as I tried to sound confident and reassuring. My fear of losing her did not make me disconnect or protect myself; if anything, it made me cherish her more. And yet if I said to you that the defining event of my life was losing my mother in 2014, and I sometimes think that it was, it’s possible that you would not consider that “normal.” The death of an aged parent is in no way unnatural, in no way a subversion of the proper order of things, and yet the grief it leaves behind is real, and strong, and lasting. In past centuries, parents may have felt similarly about their children.
By those standards, we are now all immensely privileged, parents and pediatricians alike. It used to be that nearly every president had a child’s grave to visit; to have a president today who has experienced this loss is to see an adult whose life has been wrung by a tragedy that many people cannot imagine. The statistics tell us that while there is still important work to be done to achieve equity, we have the power to create a world of security, a world in which the death of children becomes an even rarer tragedy. That safety turns out to bring with it profound obligations for parents, and for everyone who cares for children; in a certain sense, we have altered the course of nature by deciding that every child should thrive, that every parent should be exempt from grief. We no longer expect infants and children to die, and that is a great and glorious sea change in what it means to be a parent, but as with every other aspect of parenthood, this privileged vantage turns out to be accompanied by complex anxieties and doubts.
Paradoxically, there are parents who have come to take that safety so for granted that they have grown cavalier about the great gift of childhood vaccines, or perhaps have decided to be more frightened of vaccines than of diseases, and let their children ride on the back of herd immunity. But the lesson of the battle against infant and child mortality is that the only way to make the world safe for individual children is to work on making it safer for all of them. Living to grow up was not a biological imperative. We have created this entitlement, we clever humans, out of science, sanitation, and medicine, along with public policy and parent advocacy. Perhaps it is only right that it makes us a little anxious, reminding us not to take it for granted, this great achievement, that unlike parents of previous generations, we expect to see our children—all of them—live to grow up.