Even Noah Webster, that master of words, did not have a name for the terrible sickness. “In May 1735,” he wrote in A Brief History of Epidemic and Pestilential Diseases, “in a wet cold season, appeared at Kingston, an inland town in New-Hampshire, situated in a low plain, a disease among children, commonly called the ‘throat distemper,’ of a most malignant kind, and by far the most fatal ever known in this country.” Webster noted the symptoms, including general weakness and a swollen neck. The disease moved through the colonies, he wrote, “and gradually travelled southward, almost stripping the country of children…. It was literally the plague among children. Many families lost three and four children—many lost all.” And children who survived generally went on to die young, he wrote from his vantage point of more than half a century later. The “throat distemper” had somehow weakened their bodies.
In 1821, a French physician, Pierre Bretonneau, gave the disease a name: diphtérite. He based it on the Greek word diphthera, for leather—a reference to the affliction’s signature physical feature, a thick, leathery buildup of dead tissue in a patient’s throat, which makes breathing and swallowing difficult, or impossible. And children, with their relatively small airways, were particularly vulnerable.
Throughout the 18th and 19th centuries, diphtheria challenged doctors with the terrible specter of children choked, smothered, snuffed out. It brought terror to the richest and the poorest, blighting famous families and anonymous ones. Queen Victoria’s daughter, Princess Alice, died of diphtheria in 1878 at the age of 35. Five of Alice’s children had also been sick with the disease, along with her husband, the Grand Duke of Hesse-Darmstadt; their youngest child died. The tragedy prompted the Sanitary Journal to warn readers of the “kiss of death” that had most likely spread the disease through the royal family: “The greatest care and thoughtfulness should be exercised in these cases of simple sore throat, as in the severer cases; and it should be constantly borne in mind that the kissing of children at such times is most dangerous.”
While there was some understanding of how the illness spread—by what we would now call respiratory droplets, through coughing or sneezing or kissing—the actual, underlying cause was not yet known. In the meantime, it was a leading cause of death for children around the world. “Diphtheria contributed to that notion that childhood was not a safe time, that many children would die by the age of 10,” says Evelynn M. Hammonds, a professor of the history of science and African and African American studies at Harvard and the author of Childhood’s Deadly Scourge, a chronicle of early efforts to control the disease in New York City.
Then, toward the end of the 19th century, scientists started identifying the bacteria that caused this human misery—giving the pathogen a name and delineating its poisonous weapon. It was diphtheria that led researchers around the world to unite in an unprecedented effort, using laboratory investigations to come up with new treatments for struggling, suffocating victims. And it was diphtheria that prompted doctors and public health officials to coordinate their efforts in cities worldwide, taking much of the terror out of a deadly disease.
In my more than 30 years as a practicing pediatrician, I have never seen a single patient with diphtheria. That’s because vaccination efforts in this country have been so successful. In the 1980s, when I was training, there were only a few cases a year in the United States. Since 2000, there have been only six reported cases in the U.S.
And yet, the diphtheria story isn’t over. A recent analysis led by a researcher at the Centers for Disease Control and Prevention noted some 8,800 cases reported overseas in 2017. In places where people aren’t getting vaccinated, or are slacking off on booster shots, diphtheria is finding its way back. And the standard treatment, little changed in more than a century, is in short supply.
I was inspired to become a doctor partly by Paul de Kruif’s 1926 book, Microbe Hunters, a thrilling, even swashbuckling adventure about the encounters between humans and microbes. Among other things, it describes the French scientist Émile Roux, who had been Louis Pasteur’s assistant, and the German scientist Emil von Behring trying to find a way to keep diphtheria from killing children in the 1880s in Paris: “The wards of the hospitals for sick children were melancholy with a forlorn wailing; there were gurgling coughs foretelling suffocation; on the sad rows of narrow beds were white pillows framing small faces blue with the strangling grip of an unknown hand.”
One of the doctors who walked those wards in New York City around the same time was Abraham Jacobi, often called the father of American pediatrics. A Prussian-born Jew educated at the University of Bonn, Jacobi founded the first free clinic for children in New York City and in 1860 was appointed the first professor of pediatrics at New York Medical College. He was interested in research-based pediatrics and patient-oriented medicine, as well as in what we would now call the social determinants of health—the ways poverty and family circumstances and other realities of children’s living conditions shape their well-being. It’s partly because of Jacobi that this awareness has been included in U.S. pediatric training and practice for more than 100 years.
In January 1860, at a meeting of the New York Academy of Medicine, Jacobi reported seeing 122 children with diphtheria at the Canal Street Dispensary, though other doctors had reported seeing none. Some doctors might have been misdiagnosing diphtheria as a form of croup—a disease we now know as a relatively common and far less deadly infection of the airway.
Twenty years later, Jacobi put his vast clinical experience into A Treatise on Diphtheria, in which he described how he himself “became affected with diphtheritic pharyngitis followed by a tedious catarrh, consequent upon sucking the wound, during the performance of tracheotomy, in an eight-year-old child.” Almost all of his more than 200 attempts at tracheotomy—cutting the neck to open the windpipe—ended in failure. The only reason he made this last-ditch surgical effort was “the utter impossibility of witnessing a child’s dying from asphyxia.”
Jacobi was married to another doctor, the brilliant Mary Putnam, who had trained at the Female Medical College of Pennsylvania and then at the École de Médecine at the Sorbonne, where she was the first-ever female medical student. The couple had a son and a daughter, Ernst and Marjorie. In 1883, both came down with diphtheria. Jacobi would later tell a story about a family resembling his own, blaming the infection on the “trustworthy nurse.” Scholars have speculated that Jacobi may have been unable to face the possibility that he himself had brought the infection home from the clinic. Marjorie recovered, but Ernst died, at the age of 7. There was nothing doctors could do, even for their own children.
Jacobi was skeptical of the idea that diphtheria was caused by any particular bacterium. But in 1883, the same year Ernst died from the disease, the Prussian pathologist Edwin Klebs found a bacterium lurking in the leathery tissue, known as a pseudomembrane, that can block a patient’s airway.
Friedrich Loeffler, a German bacteriologist, took this microbe and grew it in the lab, to solve the mystery of whether it was indeed the cause of the disease. He followed a set of rules laid down by Robert Koch, one of the founding fathers of bacteriology. To establish that a micro-organism causes a disease, “Koch’s Postulates” state that you must show: It is present in every case; it can be grown in a laboratory; the lab-cultured organism can cause the disease in a new host; the micro-organism can again be isolated from that new host. (My medical school had us memorize Koch’s Postulates even though we knew by then that they didn’t apply to every type of infection.) Loeffler infected guinea pigs, rabbits, horses and dogs with his lab-grown cultures. The bacterium came to be known as the Klebs-Loeffler bacillus (later, Corynebacterium diphtheriae).
Then in 1888, Roux and Alexandre Yersin, medical doctors at the Institut Pasteur in Paris, took another big step when they showed that a substance secreted by the bacteria was the particular culprit. In the lab, researchers grew the bacteria bathed in a broth; after siphoning off the fluid and filtering it to remove any cells, Roux and Yersin found that the fluid contained a potent toxin. Small doses of the diphtheria toxin could do great damage in susceptible animals. So the scientists mixed the toxin with an iodine solution, which made it far less deadly.
Another step: Behring, working with Shibasaburo Kitasato, a Japanese bacteriologist, discovered that weakened tetanus toxin, given repeatedly to experimental animals, prevented those animals from developing symptoms after they were exposed to tetanus bacteria. The toxin had spurred the animals’ immune systems to recognize and fend off the invading bacteria. Moreover, when lab workers took blood from those immunized animals and removed the blood cells, the remaining serum contained antibodies to tetanus that, when injected into other animals, provided immunity to those animals, too. Behring applied this same principle to diphtheria, creating a serum that could be used to combat the disease in humans. He was recognized for this work in 1901 with the first-ever Nobel Prize in Physiology or Medicine.
The antitoxin wasn’t a drug that would kill an infecting microbe. The first antimicrobial drug, Salvarsan, which worked against syphilis, was discovered in 1909, and antibiotics like penicillin, which worked on many infections, didn’t become available until decades later. And the antitoxin wasn’t a vaccine that would activate the patient’s own immune system. But antitoxin for diphtheria was bacteriology’s first great weapon, a technique for borrowing products made by another immune system—antibodies that would hang around in the patient’s blood long enough to battle the infection.
To make large quantities of this life-saving therapy, Roux and two colleagues, Louis Martin and Auguste Chaillou, relied on horses, which produce copious amounts of serum. In Paris, they injected horses with weakened diphtheria toxin. They waited for the animals to produce antibodies in response, then bled the animals and collected the serum. From February to July 1894, at the city’s large Hôpital des Enfants-Malades (or Hospital for Sick Children), Martin, Roux and Chaillou administered horse serum containing antitoxin to 448 children suffering from diphtheria. Just 109 of them died, giving a fatality rate of 24.3 percent. Meanwhile at the pediatric Hôpital Armand-Trousseau, where the serum was not used, the fatality rate was 60 percent.
Roux presented these results at the International Congress of Hygiene and Demography in Budapest in 1894. One American doctor later wrote that he had never before seen “such an ovation displayed by an audience of scientific men…. Hats were thrown to the ceiling, grave scientific men rose to their feet and shouted their applause in all the languages of the civilized world.”
For any child sick with diphtheria at the very end of the 19th century, the key question was whether the antitoxin would be available. It came to New York City almost immediately. Hermann Biggs, chief inspector of pathology, bacteriology and disinfection at the New York City Board of Health, learned about the antitoxin during a trip to Europe in 1894, and he cabled a colleague to start making serum. When the city wouldn’t provide immediate funding for horses and equipment, Biggs and a colleague, T. Mitchell Prudden, put up some of their own money, and the New York Herald raised funds in a subscription campaign. The horses were stabled at the New York College of Veterinary Surgeons on East 57th Street. Within a year, New York City had given 25,000 doses of antitoxin to patients.
But the therapy was distributed unevenly in the United States when the young son of W.E.B. Du Bois got sick. Du Bois, the historian and activist who had been the first African American to earn a doctoral degree at Harvard, left Philadelphia in 1897 for an academic job in Atlanta. In 1899, his 2-year-old son, Burghardt, came down with diphtheria symptoms. In Du Bois’ classic 1903 book, The Souls of Black Folk, he wrote about his child’s death. “And then one night the little feet pattered wearily to the wee white bed, and the tiny hands trembled; and a warm flushed face tossed on the pillow, and we knew baby was sick,” he wrote. “Ten days he lay there,—a swift week and three endless days, wasting, wasting away.”
The night before Burghardt’s death, his father had gone looking for a Black doctor, assuming that no white doctor in Atlanta would treat a Black child. But he was unable to get treatment for his son. Du Bois’ wife, Nina, believed that if the family had stayed in Philadelphia, the child would have survived. His parents chose to take his body back to Great Barrington, Massachusetts, where Du Bois had spent his own childhood. As Du Bois wrote, “We could not lay him in the ground there in Georgia, for the earth there is strangely red; so we bore him away to the northward, with his flowers and his little folded hands.”
The case is so well known in public health circles that a couple of physicians recently revisited the question of whether diphtheria antitoxin was in fact available in Atlanta at the time of Burghardt Du Bois’ death. In a 2015 article in the Journal of the National Medical Association, Robert Karp and Bobby Gearing drew on newspaper accounts and other sources and reported that at least one Atlanta physician—J.A. Summerfield, who was white—had apparently received a shipment of antitoxin, from France. If there was any antitoxin for diphtheria in Atlanta in 1899, the journal authors wrote, it would have been available only to Summerfield’s patients. There would have been some chance of getting the antitoxin in Philadelphia, where a physician named Edwin Rosenthal was providing the therapy at a clinic that promised equal access without regard to race, creed or national origin. Still, Philadelphia’s public health services were inadequate, and its diphtheria death rates were high. “Burghardt Du Bois’ chance for survival,” the article concluded, “would have increased many-fold had the family lived in Boston or Berlin.”
Deaths from diphtheria dropped dramatically in places where the antitoxin was most available and the public health infrastructure was most efficient: cities like Berlin, Paris, New York, Chicago and Denver. In some cities, leaders were working to make bacterial diagnosis and treatment available to all. The New York Board of Health also placed quarantine placards on tenements in which diphtheria (or measles or scarlet fever) appeared; as Hammonds, the historian, points out, the signs had the effect of making these infections much more visible, which perhaps helped stop the spread of the disease, but also, to some extent, stigmatized the people living in those buildings.
The New York Herald and the New York Times chronicled the dissemination of this new therapy, and also argued in editorials that it should be administered by public health officials, not by private doctors. The Herald said the therapy would save thousands of human lives, “especially the lives of the little ones of the poor, who have always been shining marks for the dread darts of this most fatal of scourges.”
Yet the antitoxin couldn’t save everyone. In 1904, former President Grover Cleveland and his wife, Frances, lost their daughter Ruth, a popular figure known affectionately as Baby Ruth, to diphtheria at the age of 12, though she’d received the antitoxin the day before. The therapy provoked severe side effects in many children, who developed fevers, rashes or pain and swelling of the joints—reactions to other substances in the horse serum besides the protective antibodies.
Significantly, the antitoxin was not the same as the inactivated toxin that would later be included in the vaccine. It did not prompt a child’s own immune system to make antibodies, but instead transferred those antibodies made by the horse. A child infected by diphtheria and successfully treated with horse serum could later contract the infection again. So when von Behring (whose earlier contributions had earned him the noble “von” before his last name) developed a vaccine against diphtheria, the work was hailed as major progress. His vaccine had two components: diphtheria antitoxin, which could battle an active infection, and also an inactivated version of the toxin produced by the bacteria. Since the vaccine didn’t include any actual bacteria, it couldn’t cause a diphtheria infection. But exposing patients to the toxin, in weakened form, stimulated their immune systems to make long-lasting antibodies.
Within several years of von Behring’s achievement, massive pediatric immunization programs were underway in New York City. A pediatrician named Bela Schick also developed a test (similar to the TB skin tests still used today) in which the doctor injected a tiny amount of diphtheria toxin into the skin. A person who had not been previously exposed to diphtheria, and thus had no immunity to it, would develop a red bump at the site. A person who already carried antibodies to diphtheria would not react. The skin test would prove useful in screening patients for vaccination, as shown by city health official William Hallock Park’s study of 180,000 New York City schoolchildren. Half of them were administered the Schick test, and children who showed a reaction—they were not already immune—received the vaccine. The others—not tested, not vaccinated—developed four times as many cases of diphtheria.
With funds from the American Red Cross, and later with extensive support from the Metropolitan Life Insurance Company, diphtheria immunization continued. In New York City, school nurses were key in these campaigns. Public health authorities provided information in Yiddish, Italian and just about every other language that would help the vaccine reach immigrant communities. By 1929, the city was also opening diphtheria immunization stations in parks.
Perhaps the most famous episode in the battle against diphtheria played out in the Territory of Alaska. Curtis Welch, the only physician in Nome, was aware that Native Alaskans had little or no immunity against diphtheria. He ordered antitoxin for his patients in 1924, but the local harbor froze before a ship could deliver the treatment.
By January 1925, he was seeing children suffering from diphtheria. There had to be a way to bring in the antitoxin; 300,000 units were waiting at a hospital in Anchorage to be delivered. Illustrating just how far authorities were willing to go to deliver the treatment, Welch arranged for the vials to be carried by train part of the way, and to cover the remaining 674 miles, Gov. Scott Bone planned a sled dog relay, involving 20 teams of dogs and their drivers. As the world followed their journey via newspapers and newsreels, the heroic dogs carried the metallic cylinder containing the vials of antitoxin. People everywhere cheered on the men guiding the teams across the frozen landscape. One of the dogs was the subject of Togo, a 2019 Disney movie, and another, Balto, is immortalized in a much-beloved statue in New York’s Central Park—a noble bronze tribute to an extraordinary adventure in the prevention of human suffering.
The groundbreaking campaigns of the 1920s and ’30s evolved into a universal program of infant vaccination in the United States. A DTP vaccine, created in the 1940s, combined diphtheria and tetanus toxoids with an inactivated version of the bacteria that cause whooping cough (pertussis). Today’s DTaP vaccine still contains the diphtheria and tetanus toxoids, but its pertussis component has been reformulated to use purified bacterial proteins rather than whole cells—the “a” stands for acellular. The vaccine generates immune reactions that protect against all three diseases.
The CDC recommends that children receive DTaP shots at 2, 4, 6 and 15 months, and between 4 and 6 years old. (A booster shot at 11 or 12 involves a different vaccine called Tdap, formulated for older people, which can then be given every ten years.) Largely as a result of routine, low-cost vaccination, diphtheria is rare in the U.S. The World Health Organization makes recommendations similar to the CDC’s, and public officials in most nations urge parents to get their children vaccinated. Despite all the progress preventing and treating the disease, diphtheria has not been eradicated and still flares up around the world, according to a recent analysis of cases by Kristie Clarke, a CDC epidemiologist. She counted almost 9,000 diphtheria cases globally in 2017. Outbreaks tended to occur in places destabilized by population migration and political strife—she cited Bangladesh, Yemen, Nigeria and Venezuela. Diphtheria emerges, she told me, “when anything disrupts routine vaccination.” And the disease is still a killer; the mortality rate usually cited is 5 to 10 percent, but fatalities can be especially high in areas where medical care is not available. A 2011 outbreak in Nigeria had a case fatality rate of almost 43 percent in children 4 and younger.
For those who do get the disease, diphtheria antitoxin is still a mainstay of treatment, but Clarke told me the antitoxin is in short supply globally. Strangely, the technology of producing the antitoxin has not changed much: It is still made by injecting horses with weakened diphtheria toxin. Michael Hust and Esther Wenzel, medical researchers at the Technische Universität Braunschweig, in Germany, are trying to change that. Their work involves developing a recombinant antibody molecule—building it genetically in the laboratory and amplifying it through cloning, rather than infecting animals and letting their immune systems do the work. The laboratory-made antibody is designed to attack the diphtheria toxin. And if all goes well, it will have fewer side effects than the horse-derived antitoxin, because the new medicine will be what Wenzel called “a fully human product,” based on antibodies originally manufactured by human cells, reducing the chance the body will react to it as a foreign substance. “In an optimal world, we would all be vaccinated, we would not need these antibodies, but you have a lot of outbreaks in different parts of the world,” Hust told me. In Europe, he said, the antitoxin still sometimes has to be rushed from one country to another and arrives too late.
As with many vaccines, the initial infant series of diphtheria vaccinations is not enough to confer robust lifelong immunity, so children and even adults may become susceptible to the disease if physicians and health officials neglect to administer boosters. Clarke’s work helped the World Health Organization develop new guidelines, emphasizing the importance of the boosters.
At a time when so many Americans are distrustful of vaccines, I often think about the talks I used to have with parents in the 1990s. We were still using the old DTP vaccine, which meant children sometimes experienced side effects, especially fevers and sore arms. The discomfort was not nearly as terrifying as the diseases the vaccine protected against, but parents had no firsthand experience with those diseases, thanks to years of successful vaccinations. My challenge was to help them understand that when they got their babies vaccinated, they were doing their part in a great triumph of human ingenuity and public health. The whole point was to keep those babies safe.
In a Canadian journal article from 1927, a doctor recalled the years before the antitoxin was available, when he’d had to watch a “beautiful girl of five or six years” choke to death. Later, the doctor’s own daughter came down with diphtheria, but a decade had passed and now the antitoxin was available. “To watch the choking dreadful membrane melt away and disappear in a few hours with complete restoration to health within a few days,” he wrote, “was one of the most dramatic and thrilling experiences of my professional career.”
As science and medicine move forward, vaccines and treatments allow parents—and physicians—to care for children without dreading some of the most terrifying infections of the past. Remembering these success stories can help us maintain a feeling of awe, gratitude and willingness to do our part.