Do Our Odds of Dying Ever Stop Increasing With Age? Scientists Disagree – The Crux

(Image: elderly couple sitting together. Credit: Pressmaster/Shutterstock)

As we get older, our chances of dying go up. To that, you might say, well, duh. But when we hit around 80 years old, a funny thing seems to happen: our odds of dying stop increasing and instead start leveling out. So you'd have the same shot, roughly 50/50, of croaking at, say, 110 (an age that would make you a so-called supercentenarian) as you would at 95.

It’s an odd phenomenon that has left experts puzzled for years, and researchers have floated many theories to explain it. Some think an evolutionary quirk might allow it to happen. Others posit that there’s some cellular funniness afoot that lets people of extreme old age somehow accumulate damage in their cells more slowly.

But some experts contend that this human mortality plateau may not be happening at all. A new paper in PLOS Biology argues that most, if not all, of the proposed observations of so-called late-life mortality plateaus could be chalked up to scientific error.

The Illusion of Age Plateaus

Saul Newman, a postdoc in machine learning and wheat genomics at the Australian National University in Canberra, is the sole author of the report. He argues that the aging plateaus researchers have documented in the past can be explained by a combination of errors related to the data sets they’ve used. Erroneous age reporting is one factor that could skew data. For example, he says in his paper, back in World War I, roughly 250,000 people in the U.K. reported being older than they were in order to join the fight.

But it seems the factors that can distort aging trends the most are the demographics researchers target for their studies and the accuracy of birth and death records they use.

Newman writes that our understanding of aging trends often relies on records from more than 90 years ago, which may not be all that reliable. In fact, even today, there are areas where record-keeping has some catching up to do: more than a third of global births still go unreported, and in some countries there isn’t even enough data to estimate births with any accuracy. Even in developed countries, he argues, there are no populations with record-keeping both accurate enough and large enough to examine late-life aging trends.

Still, it’s the records from those developed countries that underscore his point: in the populations where record-keeping is strongest and most robust, the odds of observing late-life mortality plateaus shrink.

To demonstrate his claims, Newman used demographic data from international databases — including the U.N.’s World Population Prospects and the Human Mortality Database, maintained by teams at the University of California, Berkeley, and the Max Planck Institute for Demographic Research — to run his own statistical analysis. Essentially, he took this data and assumed that aging plateaus don’t exist and that, instead, our mortality rates continue to increase as we age. Then he introduced errors, such as incorrectly recorded ages, into that data. He found that error rates as low as one in 10,000 were still enough to produce apparent late-life mortality plateaus.
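The logic can be sketched in a few lines of simulation. This is not Newman’s actual code or his parameters — the Gompertz constants, cohort size, error rate and error model below are all illustrative assumptions — but it shows the mechanism he describes: if true mortality keeps rising exponentially with age, a tiny fraction of overstated ages is enough to drag the *observed* hazard at extreme ages far below the true curve, because the apparent survivors at those ages are mostly people who are really younger.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Gompertz hazard h(x) = a * exp(b * x), roughly 6%/year at age 80
a, b = 2e-5, 0.1
N = 1_000_000      # simulated cohort size
p_err = 1e-3       # fraction with a clerical age error (Newman reports even
                   # 1-in-10,000 rates suffice; a higher rate is used here so
                   # the effect is visible in a modest simulated cohort)

# Sample true death ages by inverting the Gompertz survival function
u = rng.uniform(size=N)
true_death = np.log(1.0 - (b / a) * np.log(u)) / b

# Inject errors: a few individuals' recorded ages overstate their true age
recorded = true_death.copy()
err = rng.uniform(size=N) < p_err
recorded[err] += rng.uniform(0.0, 25.0, size=err.sum())

def observed_q(death_ages, x):
    """Discrete hazard at age x: P(die before x + 1 | alive at x)."""
    alive = death_ages >= x
    died = alive & (death_ages < x + 1)
    return died.sum() / alive.sum()

# True one-year Gompertz death probability at 110, for comparison
true_q_110 = 1.0 - np.exp(-(a / b) * (np.exp(b * 111) - np.exp(b * 110)))

print(f"true Gompertz hazard at 110: {true_q_110:.2f}")
print(f"observed hazard at  80:      {observed_q(recorded, 80):.3f}")
print(f"observed hazard at  90:      {observed_q(recorded, 90):.3f}")
print(f"observed hazard at 110:      {observed_q(recorded, 110):.3f}")
```

Running this, the observed hazard climbs steeply through the 80s and 90s, tracking the true Gompertz curve, but at 110 it sits well below the true value — a spurious flattening produced entirely by the injected record errors.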

It’s “a blindingly simple explanation,” he says. “Until we have the human equivalent of tree-rings, an unforgeable measure of human age,” he writes in a follow-up email, “these data will remain dubious at best. Without a real, biological metric of human age, we cannot discriminate bureaucratic error from reality.”

A Push for Better Data

But Kenneth Wachter, a demographer and statistician at the University of California, Berkeley, argues Newman’s paper has its limits.

Though he concedes that many of the data sets that have been available can be laden with errors, he says researchers are working to change that. “Newman’s point really applies more to the official statistics that people have had to rely on, before big investments were made in individual age validation,” he says. “The new works that are coming out are really aimed at getting around the kinds of problems that Newman is featuring.”

He points to a study of his own, published in Science in June, which Newman has also criticized in a separate commentary released along with his PLOS Biology paper.

This Science paper relied on records from nearly 4,000 Italians, all of whom were confirmed to be 105 years old or older when the study was being conducted. That data was from the Italian National Institute of Statistics, which required that birth and death certificates be provided for everyone included in the sample. According to Wachter, the way those records were kept means the possibility of clerical errors is low. “Their birth records were kept in local municipalities in bound volumes, one volume for each year. So, a person who appears in the register for 1900 can’t have been born in 1910, the way Newman’s hypothetical model would say. In other kinds of studies, which are based simply on ages that people say they are, you can have the kinds of errors that Newman has hypothesized. Lots of people say they’re older than they are.”

A Fuzzy Future

Their disagreement underscores a tug of war that has been going on for years and has only ramped up recently.

“It’s a big question,” Wachter says. But he’s optimistic about the future. He says the results that he and his team found “suggest flexibility in lifespans” and that “we haven’t seen all the progress that’s possible.”

Newman, though, is more cautious. “People have had this false dichotomy where they think that, oh there’s a limit, or there’s no limit, and it’s that simple,” he says. But he suspects it’s more nuanced. “There’s not one limit to human life. There’s a different limit to human life depending on where you’re from, what environment you grew up in, what time period you’re from and also what gender you are.”

He also thinks researchers’ time would be better spent digging into typical aging. “I would argue for less research on supercentenarians,” he writes in an email. “It is just unreliable data. That is not to say I think this research is not important. Instead, we must focus … on the ordinary aging of ordinary people.”
