The year that WI-38’s mother contracted rubella, she was hardly alone. Tens of thousands of babies would be born with the blindness, deafness, and brain and heart defects of congenital rubella syndrome in 1963, and tens of thousands more would die in utero of their mothers’ rubella infections. So WI-38’s mother chose an abortion, which was perfectly legal at the time in Sweden, where she lived. Abortion was not widely legal in the United States then, but the tens of thousands of American babies born with congenital rubella syndrome during this epidemic changed that. Seeing so many babies die horribly or become devastatingly disabled from an infection they contracted in utero led to popular acceptance of abortion in America, which in turn led to its widespread legalization there.1
WI-38’s lung cells were used to culture a weakened version of the rubella virus, something that had never been done before. This became the basis for the rubella vaccine, which was wildly effective and widely used. By the time my mother was pregnant with me, ten years after the 1963-64 rubella epidemic and its tens of thousands of dead babies, there had been just under 12,000 cases of adult rubella and fewer than 500 babies born with congenital rubella in the entire United States in the five-year period since the vaccine had rolled out in 1969. By 2020, only about 600 babies per year were confirmed to have been born with congenital rubella syndrome on the entire planet.2
It is very easy to forget how bad the rubella epidemic and its dead and dying babies were, even if you lived through it. And of course most of us did not live through it. To have any memories of the terror of rubella at all, you’d have to have been born no later than about 1950, making you 74 years old today. Almost all of us are completely ignorant of the world the rubella vaccine built for us, even though we live in that world. Our world is almost entirely free from the fear that a mild respiratory infection in a mother will kill or profoundly and permanently disable her child. We never have to think about that possibility for even a moment.
And so we don’t.
Babies used to die a lot, and not just from rubella. In the ancient world, about 40 to 50 percent of babies died in childhood. Most of those deaths were from disease. But not all of them. Infanticide was commonly practiced in ancient Greek and Roman society, whether by exposing unwanted babies to the elements without caring whether they died or by directly murdering them. A Greek child was not named until ten days after birth; until then, it was perfectly legal for the father to insist on the baby’s exposure. Illegitimate children, daughters, and disabled babies were the most likely to be left to die.3 In ancient Rome, killing disabled babies by drowning was mandated by law.4
Killing babies who were unwanted for whatever reason was made illegal by Constantine, the first Christian Roman emperor, who initially also suggested that the state help pay for babies’ care when families were too poor to do so, thus removing the economic incentive for infanticide.5 Babies still died, though. In the early 1990s, a massive cemetery for infants dating from about 450 A.D. was discovered outside of Rome. But these babies hadn’t been murdered or left to die:
…[T]he Christian influence must have been established by then, or people would not have even thought to have a cemetery where newborn children were given proper burials. Since Christians baptized infants and considered them significant humans at least from birth, they could not merely discard the bodies of dead infants or bury them unceremoniously within houses, as had been the earlier Roman practice.6
The babies found in the fifth-century Roman cemetery spanned a range of ages, from 22 miscarried fetuses to toddlers. Researchers think that these babies died of malaria.
Ali Maow Maalin was an adult when he died of malaria in 2013.
He had earlier gained fame as the last human being to naturally contract smallpox, in 1977. His illness strengthened his commitment to vaccination, the cause he worked on for the rest of his life.7
The eradication of smallpox is one of the most stunning accomplishments in all of human history. It marks the first of only two times that humans have made a disease go extinct. The total number of deaths caused by smallpox is difficult to tally, but it is in the billions: in just the 20th century, the last century of smallpox’s existence, at least 300 million people died of it. It was airborne and very infectious: each person infected with smallpox was likely to infect about six others who weren’t already immune. For most of human history, if you got sick with smallpox there was about a 1 in 3 chance that you’d die of it.8 Illness from smallpox was painful. In addition to the fever, aches, and pains that accompany any severe viral infection, smallpox gets its name from the pus-filled blisters it causes to form on the skin and other body surfaces.
The global campaign to eradicate smallpox started in 1959, just ten years after smallpox had been eliminated from the United States. Ten years after the global eradication campaign started, there were still tens of millions of cases of smallpox on earth every year. The smallpox eradication campaign was formally declared over in 1980, three years after Ali Maow Maalin recovered, with no new cases anywhere in those three years. Because smallpox relies on humans to replicate, we knew it was gone for good.9
We attempted something similar with vaccination against polio. That’s what Ali Maow Maalin was working on when he died. The year he was born, 1954, polio had raged around the world, causing epidemics in the United States and Europe that helped motivate the development of a vaccine that became available in 1955. It’s hard for people now to recognize how bad polio was; as with rubella, in Western countries only people at least in their 70s have any memories of this terror. By 2013, when Ali Maow Maalin died of malaria, there were only 416 cases of polio in the whole world.
Last year, about 600,000 people died of malaria. As in fifth-century Rome, the vast majority of the dead were babies and small children. Malaria was eliminated from Rome in the 1950s, around the same time it was eliminated from the United States, and across Europe and northern Asia. About 95% of malaria cases today occur in Africa, where Ali Maow Maalin was infected and where he died.
There was no vaccine for malaria when Ali Maow Maalin died of it, but there is now. Because malaria parasites are dependent on humans for their reproduction, like smallpox was, humans could theoretically eradicate malaria, too.
Because malaria is potentially eradicable and because it’s so deadly to babies, stopping its spread is one of the favorite causes of effective altruists.
Effective altruism takes its cues from utilitarianism, the idea that it is morally good to increase well-being and reduce suffering, and that everyone should count equally in this calculation. Babies shouldn’t count less just because they’re babies, and babies far away shouldn’t count less than babies close to me. If all lives are equally valuable, and I can save faraway babies with anti-malaria interventions, then I’m morally obligated to do so. For this reason, the top two of effective altruist charity ranker GiveWell’s four top-rated charities are aimed at malaria prevention.
But why would you think that all lives are equal, that these babies are worth saving? This is not a natural thing to think. It’s not what the ancient Greeks and Romans thought, and they were hardly alone: throughout human history, cultures that condoned infanticide have been much more common than those that condemned it. Archaeologists estimate that between 15 and 50 percent of stone-age infants were intentionally killed. The primates to which humans are most closely related routinely kill the babies of rivals, and most mammals kill and consume some of their own babies, at least occasionally. The idea that I might have moral obligations to save babies I’m not even related to is especially weird.
But the notion that all human lives have value, that babies shouldn’t be killed by malaria, rubella, or infanticide no matter where they are is now so deep in our psychology that we never have to think about the possibility that it might not be.
And so we don’t.
Rounding out the list of GiveWell’s four top-rated charities is one to increase uptake of childhood vaccines. Over the past few decades, access to vaccines has radically expanded around the world. For instance, the year I was born, fewer than 3% of the world’s children were vaccinated against rubella; now it’s about two-thirds.10 But vaccination of babies against diseases that can kill them has slipped in recent years11, as the percentage of adults who believe that vaccination is important has declined.
A popular modern argument against childhood vaccination is to note that the diseases these vaccines prevent were once commonplace childhood events. Since no one making that argument has experienced those diseases themselves, it’s easy to imagine that any problems caused by vaccines might be worse than the once-common diseases they prevent. It’s almost impossible to imagine the scale of childhood death in that past world.
And so we don’t.
What did they do, we might naively ask, our ancestors before the rubella vaccine? Well, they did what the Romans did: they just let babies die.
I was born into a world built by scientific progress. When I was a baby, I was inoculated against everything we had vaccines for, so I never had to worry about rubella, or measles, mumps, diphtheria, whooping cough, polio, or the smallpox that had been nearly eradicated by the time I was born. I never even had to know that these diseases existed for me to have been kept safe from them by my parents, by the world my ancestors built. A world safe for babies was just part of the infrastructure, easy to take for granted.
Scientific progress is pretty great, and I try not to take it for granted. It’s also pretty new, in a historic sense, which makes it a bit easier to see once you look for it. You only have to go back 100 years to find a world where none of those vaccines existed, where about one out of five babies born in the United States died before turning five. You only have to go back 50 years, to the year I was born, to find a world where one out of ten of all the world’s babies died as babies.
You have to go back longer to uncover the infrastructure making us value the lives of babies, even the ones far away, in the first place. You have to go back to before Roman Christians built the graveyard for babies who died of malaria in the fifth century A.D. But it’s important to note that that infrastructure wasn’t always there.
Science gets a lot of credit for building the world we have today—as it should. Science, the story goes, swept away the irrationality and superstition of earlier times, allowing humans to finally make the breakthrough discoveries that would allow us to save the lives of babies everywhere. And science did allow us to build an infrastructure to save the babies of strangers. But why did we want to?
You won’t find that answer in science. There’s nothing rational in wanting to save the lives of non-kin, as any evolutionary biologist will be quick to explain. It’s in your evolutionary interests that your genes survive, and that your rivals’ don’t. That’s why primates kill each other’s infants. That’s why Romans drowned babies who couldn’t carry on the family line. If you feel a moral obligation towards helpless strangers, that’s just an accident of our adaptive selection to provide for our own helpless babies, to promote our own genes. Any evolutionary psychologist can tell you that.
I have been told that religious faith is a similar sort of thing: an artifact of an evolved penchant for storytelling, finding motives and meanings for phenomena. Belief was also evolutionarily advantageous, because attributing reasons to events helped our ancestors survive. But now that we understand that, we can throw off those fetters of belief in God, and make our own meaning.
What is that, though? And does our self-chosen meaning involve us working to save the babies of strangers, or killing them?
Another modern argument against vaccination is that the diseases the vaccines protect against are now rare. An American baby isn’t at much risk of dying of rubella even if her mother were not vaccinated, because it’s unlikely that she’d come into contact with someone who had rubella in the first place.
Of course, the reason rubella is rare is that we vaccinate against it. It has not been eradicated. If we were to stop vaccinating, it would come back. This has happened with other vaccine-preventable diseases as vaccination rates have fallen. Measles was declared eliminated from North America in 2005, but it’s back now. Polio has become resurgent in places it had been eliminated from as polio vaccination campaigns have faltered. I would not want babies to die of congenital rubella syndrome again, so I am pro-rubella vaccination.
I believe vaccines save babies’ lives, but why do I believe in saving the babies of strangers?
Besides making sure I was inoculated against rubella when I was a baby, my parents also inoculated me with an idea. Here it is, in the form of a nursery rhyme:
Jesus loves the little children
All the little children of the world
Red or yellow, black or white
All are precious in His sight
Jesus loves the little children of the world
I believe that all human lives are valuable, and thus that all babies should be saved, because the Bible tells me so. I kept believing this, in the inherent value of all human lives, even during the time when I stopped believing in the Bible. That’s what a good inoculation will do for you.
In medicine, inoculation means training someone’s immune system on aspects of an infectious agent so that the immune system will recognize and mount a response to a disease before the person has actually been exposed to it. By analogy, in communications, inoculation means that someone’s first exposure to an idea is far more influential in shaping their understanding of it than any later exposure can be.
In the United States, 28% of adults report being religiously unaffiliated, as atheists, agnostics, or “nothing in particular.”12 This number has steadily risen in the past 20 years, both in the United States and in Europe. It is relatively rare, though, that someone with no religion was raised that way. Overwhelming majorities of Western adults who describe themselves as having no religious beliefs were raised as Christians.13 Not that they plan to raise their own children as Christians. Adults who converted to having no religion tend to raise their children in non-religious households, and avoid exposing their children to religious ideas. A first exposure to an idea is an inoculation, molding a person’s views for the rest of her life. Children shouldn’t be indoctrinated into a religion when they’re too young to understand it, the argument goes. Rather, they should make their own decisions about their values when they grow up.
But what values will those be, and how will those decisions be made? If values like “the lives of even strangers’ babies should be saved” were obvious or logical, those values would be more universal than they in fact are. If evolutionary theory or everyone’s experience of being human teaches us anything, it’s that the universal value we have from birth is selfishness: to privilege our own perspective and interests over others’. Other people are as valuable as you are, and you should sacrifice your own interests for their good? That’s not a natural thing to think. That’s something you have to be taught, a belief you must be inoculated with from an early age.
A baby can’t manufacture salutary values for herself de novo any more than she can come up with the perfect antibody to protect herself against rubella without being inoculated. I am eternally grateful that my parents had me inoculated—both with the rubella vaccine and with the Gospel.
Religiously unaffiliated adults make moral decisions using the same criteria as religious ones do, the Pew Research Center reports:
When making decisions between right and wrong, most “nones” say they rely extensively on the desire to avoid hurting people, and on the use of logic and reason.
Overall, 83% of “nones” say the desire to avoid harming other people is extremely or very important to them when making moral decisions, while 82% say the same about the use of logic and reason…
Like most “nones,” most religiously affiliated Americans cite each of these four considerations – not wanting to hurt people, logic and reason, feeling good when choosing the right thing, and wanting to stay out of trouble – as key factors when making decisions between right and wrong.
What most distinguishes “nones” on this survey question is a lack of reliance on religious beliefs.
Relatedly, the effective altruists urging us to “save the life you can” generally self-describe as agnostics or atheists, having come to their conclusions about how to most efficiently save the babies of strangers through logic and reason alone.
But while logic and reason might help one find a better way of helping others, they cannot provide the premise that one should. The belief that other lives count as much as one’s own, and that you should sacrifice your self-interest to help others, is not the conclusion of a set of logical arguments; it’s the premise on which the argument is based. Since it’s not one that’s intuitively obvious, it’s worth thinking hard about where that premise comes from before we go knocking down its infrastructure.
And I can’t help but notice that, whatever their adult beliefs, almost every utilitarian, effective altruist, and global campaigner for childhood vaccines was inoculated with Christian ideas about the intrinsic value of all human lives as a child14—just like I was.
We don’t vaccinate against smallpox anymore, because it was eradicated. In nature. But it wasn’t in the rational self-interest of the world’s two superpowers at the time, the United States and the Soviet Union, to give up their remaining stocks of smallpox virus. So vials of smallpox were kept in labs in both countries. And that’s where they remain, so far as we know.
Rates of childhood vaccinations for diseases including rubella continue to fall in Western countries. Polio vaccinations were recently suspended entirely in Afghanistan, one of the last two countries with endemic polio.
People who were themselves raised with religious ideals about the value of human life are increasingly choosing to raise their own children without religion, increasingly insisting on an absence of religious influences in the public square.
But the nation of Cameroon plans to vaccinate a quarter of a million children against malaria in the coming year15, and our small church has several children in first communion class. I will also do my best to help maintain this neglected fence.16
What will the world look like, if we stop inoculating babies against infections, and stop inoculating them with the idea that all babies have value? It’s hard to imagine the possible consequences of disabling an infrastructure we’ve never had to consider in our lives before.
And so we don’t.
This essay is heavily influenced by Tom Holland’s Dominion, an eye-opening analysis (by an atheist) of the Christian infrastructure of “secular” Western ideals including universal human rights. I am also indebted to the authors of the essays “Fish in Water” and “Please Protect Wisdom From Midwits” for inspiring the line of thinking that led to this essay. Thank you also to the writer of the essay on declining trust in vaccines.

Allen, Arthur. (2007) “Mumps, Rubella, and Abortion Liberalization,” in Vaccine: The Controversial Story of Medicine’s Greatest Lifesaver, W.W. Norton and Company. pp. 232-240.
Zimmerman et al. (2022) “Progress Toward Rubella and Congenital Rubella Syndrome Control and Elimination — Worldwide, 2012–2020,” Morbidity and Mortality Weekly Report 71(6);196–201. See table for details on confirmed global cases of rubella.
Patterson, C. (1985) “Not Worth the Rearing: The Causes of Infant Exposure in Ancient Greece,” Transactions of the American Philological Association, 115: 103-123.
Harris, W.V., (1994) “Child-Exposure in the Roman Empire,” The Journal of Roman Studies, 84: 1-22.
Bennet, H. (1923) “The Exposure of Infants in Ancient Rome,” The Classical Journal, 18 (6): 341-351. This article also notes that among the reasons Roman parents would be legally required to drown a baby was the baby’s being “incertus mas an femina,” i.e., that the baby was intersex.
Wilford, John Noble. (July 26, 1994) “Children's Cemetery a Clue To Malaria as Rome Declined,” The New York Times, section C, p. 1.
“Remembering Ali Maalin,” (2018) Global Polio Eradication Initiative.
There are exceptions, dependent on one’s inborn resistance to poxvirus infections. For instance, it’s estimated that the case-fatality rate for smallpox among indigenous American populations was about 90%. A more benign variant of smallpox, Variola minor, killed only 1 to 2% of those it infected, but that strain did not exist until the early 20th century.
For a nice history of this written by some of the smallpox eradication campaign’s key designers, see Fenner et al., (1988). “The Intensified Smallpox Eradication Program” in Smallpox and Its Eradication, World Health Organization.
Our World in Data’s Vaccination page is a great source of information on global vaccination rates.
United Nations Children’s Fund. (2023) The State of the World’s Children 2023: For every child, vaccination. See figure 3 for visual summary of data on decline in confidence in childhood vaccination since 2020.
Pew Research Center (2024). “Religious ‘Nones’ in America: Who They Are and What They Believe.”
Burge, Ryan (2019). “Were Nones Raised by Nones?” on Religion in Public (blog)
Or at minimum benefited from the herd immunity effects of growing up in such a culture. One notable exception is utilitarian philosopher Peter Singer, who was raised in a non-religious household, and (in)famously does think that infanticide is acceptable and sometimes morally required—on similar grounds as ancient Roman legislators did.
Amani et al. (2024). “Introduction and rollout of malaria vaccines in Cameroon and Burkina Faso: early lessons learned,” The Lancet Global Health, 12:5e740-e741.
G. K. Chesterton wrote in 1929, “There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, ‘I don’t see the use of this; let us clear it away.’ To which the more intelligent type of reformer will do well to answer: ‘If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.’” This is the idea of Chesterton’s Fence, that we shouldn’t be too quick to destroy a piece of infrastructure—physical or cultural—without being clear that the problem it was created to solve has, in fact, been solved and the piece of infrastructure is really no longer needed.