Expectations of Life & Death

The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labour and sorrow; for it is soon cut off, and we fly away.

Psalm 90:10

What it means to grow old has changed enormously within a handful of generations, yet not in the way that we tend to assume.

The headline figures are startling: no country in the world today has a lower life expectancy than the countries where life expectancy was highest in 1800. A baby born that year in Sweden could expect to live to the age of 32; a descendant of that baby, born in the same country today, can expect to live to 82.

What is commonly misunderstood is the nature of the change behind these figures. They seem to suggest a world in which to reach your early thirties was to be old, in the way that someone in their early eighties would be thought of as old today; a world in which life was truly ‘nasty, brutish and short’. Yet the reality is that the age at which a person is thought of as old has changed relatively little from century to century, even as far back as biblical times, when the psalmist could lament the brevity of a human life that stretches to 70 or 80 years. What is different today is that living to grow old has become a reasonable expectation, something we can almost take for granted, rather than a matter of luck.

The reason for clarifying this distinction is not to downplay the extraordinary achievement represented by the increase in life expectancy at birth, but to seek to understand it better. This matters, not least, if we want to think clearly about the promises and claims being made today in the name of life extension. To do so, we need a subtler feel for statistics and also for the cultural assumptions that shape our understanding of death.

Among the contradictory tendencies that make up our culture, there is a habit of treating the fruits of measurement and calculation as revealing an underlying reality that is ‘truer’ than the deceptive evidence of our senses. It may be more helpful to think of the results of quantitative labour as the traces left by reality: footprints in the sand, clues in need of interpretation.

If the figures of life expectancy at birth are one set of footprints left by the lives our ancestors led, another trail of clues is found in the measure of the modal age at death. This tells us at what age it has been most common to die, a slightly different question to the average length of life, and one that takes us closer to the experience of growing old in a particular time and place.

In England, reliable records don’t stretch back quite as far as they do in Sweden, but it is possible to pick up the trail in 1841, when life expectancy at birth was a little over 40. In the same year, the modal age at death was 77 for women and 70 for men.

Over the following century and a half, these ages would go up to 88 and 85, respectively: a significant increase, but not of the same order as seen in the more commonly cited figures for life expectancy.

What is going on here? Why do these two ways of tracing the changing patterns of death tell such different stories? Part of the answer is that the figures for modal age at death ignore all deaths before the age of 10. Until relatively recently, the age at which it was most common to die was zero: a significant proportion of those born never made it past the first weeks and months of life. The decline in infant mortality is not the only factor in the changing of our expectations of life and death, but it is a large one, and it separates the world in which we now live from the world as our ancestors knew it.
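
One way to see how these two measures can diverge is with a small numerical sketch. The Python snippet below uses an invented distribution of ages at death, not real historical data: heavy infant mortality pulls the mean down to around 40, while the most common age at death among those who survive childhood stays in the 70s, echoing the English figures above.

```python
import statistics

# Hypothetical ages at death for an imagined cohort of 20 people,
# chosen to illustrate the pattern rather than drawn from real records:
# a quarter die in infancy or early childhood, but most of those
# who survive past the age of 10 go on to die in old age.
ages_at_death = [0, 0, 0, 0, 0, 1, 2, 5, 34, 48, 55, 62,
                 70, 72, 74, 74, 74, 77, 80, 85]

# Life expectancy at birth is simply the mean of all ages at death.
life_expectancy = statistics.mean(ages_at_death)

# The modal age at death, as used above, ignores deaths before age 10.
adult_ages = [age for age in ages_at_death if age >= 10]
modal_age = statistics.mode(adult_ages)

print(f"Life expectancy at birth: {life_expectancy:.1f}")  # about 40.6
print(f"Modal age at death (10 and over): {modal_age}")    # 74
```

The same imaginary population thus leaves two very different footprints: a mean in the low 40s, dragged down by the deaths at age zero, and a mode in the mid-70s, reflecting the old age reached by those who survived childhood.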

What grounds could there be for leaving aside the great swathes of death in infancy and early childhood? Clearly, they must be part of any attempt to form a picture of what age and dying have meant through time, but there are reasons for treating them separately from death in adult life. The first is that it is their inclusion in the averages of life expectancy which creates the misleading impression of a world in which old age began in one’s early thirties. The second is that the causes of death in infancy are different to the causes of death in adult life.

Broadly speaking, it makes sense to think of a human life as falling into three phases: the vulnerability of the first years gives way to the strength of adulthood, then after five or six decades, this strength gives way in turn to the frailty of age. In each of these stages, we are less likely to die in a given year than were our ancestors, but the things that are likely to kill us are different and so are the factors that increase our chances of survival.

Along with the idea that our ancestors could expect to die in their thirties, perhaps the most common misconception about the changing nature of age and death is that it is the result of advances in medicine. Medical technologies and interventions have played a part, but not the leading one. Of the 30-year increase in life expectancy that took place in the United States during the 20th century, only five years can be attributed to medical care: the remaining 25 were the result of improvements in public health.

This is good news. Compared to medical procedures and drug treatment programmes, public health measures tend to be cheaper, and so they reach those who do not have access to highly trained medical staff. What is more, while medical treatments frequently come with negative side-effects, improvements in public health tend to correspond to broader improvements in quality of life, for the individual and for society. A recent project in the north-east of England saw the National Health Service paying to insulate the homes of people with chronic health conditions, a move justified by the savings from reduced hospital admissions among that group.

The benefits of clean water and sanitation are particularly important to increasing the chances of survival in the vulnerable first years, whereas the benefits of advanced medical treatments are more likely to add years to the end of our lives. The importance of public health explains why increases in life expectancy have spread far beyond the reach of well-equipped hospitals. The most striking example is the Indian state of Kerala, where the average income is three dollars a day, yet life expectancy and infant mortality rates are close to those of Europe and the United States.

Such examples matter because they can bring into question the ways in which the future is usually framed. Among these is the tendency to present it as a choice: either we find a way to sustain and extend the way of life taken for granted by roughly one in seven of the people currently alive, with its technological and economic intensity, or we lose this way of life and fall into a Hobbesian nightmare. The Kerala story is complex, but among other things it is a clue that there are more futures available than we are often encouraged to think about.

Death is a biological reality, a hard fact that lies in front of all of us. It is also deeply cultural, entangled with and inseparable from the stories we tell about ourselves, the world and our place within it.

In the 1960s, the sociologists B.G. Glaser and A.L. Strauss identified two contrasting attitudes to death in American hospitals. For one set of families, mostly recent immigrants, the approach of death meant it was time to leave the hospital, so that one could have the dignity of dying at home according to custom; for another group — those ‘more involved in modernity’, as the historian Philippe Ariès puts it — the hospital had become the place where one comes to die, because death at home had become inconvenient. Much could be said about these two attitudes, those who ‘check out’ to die and those who ‘check in’, but it is hard to reduce them to a simple trajectory of historical progress in which the modern approach renders the older traditions conclusively obsolete.

Life expectancy — and death expectancy, for that matter — is good ground from which to think about the ideology of progress. It is hard to imagine anyone who would dispute that the improved life chances of the newborn represent an unqualified good. And at this point, I must disown any pretence at detachment: as I write this, I am thinking of my son, who was born nine weeks ago. I can be nothing other than thankful for the good fortune that he was born into a world — and into a part of the world — where childbirth no longer carries a significant likelihood of death for mother or baby, and where the conditions, the knowledge and the facilities are present such that we can almost take it for granted that he will make it through the vulnerable first months and years of life.

Having acknowledged this, what else could there be to say? Except that, as we have already seen, when the great changes in infant mortality are compounded into a single vector of improvement in life expectancy, the result tends to give us a misleading picture of the relationship between our lives and the lives of our ancestors. In the same way, the problem with the ideology of progress is that it requires the reduction of the complex patterns of change from generation to generation into a single vector of improvement, and the result is similarly misleading.

This may come into focus if we begin to think about life extension, a proposition around which bold predictions and promises are currently being made. Those who foresee a future in which human life is measured in centuries rather than decades often appeal to the historical statistics of life expectancy, as if the offer they are making were a natural extension of a process that has already been under way for generations.

Yet, as we have seen, this rests on a misunderstanding of what lies behind those statistics. Eighty is not the new 30 — and if someone wishes to convince us that 200 will be the new 80, they cannot call on historical trends in life expectancy as evidence.

In fact, it is not clear that the possible duration of human life has been extended. The existence of an upper limit to the human lifespan is a matter of dispute among those who study this area. (Those who study human bodies seem to be more inclined to believe in such a limit than those who study statistics.) It is true that there has been an upward movement in the age of the oldest attestable human over the past two centuries, with the record held by Jeanne Calment, who died in France in 1997 at the age of 122.

However, while Calment’s case is considered exemplary in terms of the documentary proof available, attesting the age of the extremely old remains difficult in many parts of the world, even today, and in earlier historical periods, absence of evidence cannot simply be taken as evidence of absence.

What can be said more confidently is that almost all of the increase in longevity that we now take for granted consists of a shift in the distribution of death within historically known limits. It has never been unusual for some individuals within a community to live into their late 80s; what is new is that living into one’s late 80s is becoming the norm in many societies.

Changes in infant mortality may represent an unqualified good, but when the strength of adulthood gives way to the frailty of age, the changes in what we can expect may be more open to dispute.

To generations of doctors, pneumonia was known as ‘the old man’s friend’, a condition that tends to leave the healthy untouched, but offers a relatively peaceful death for those who are already weakened. This expression reflects the idea that there is such a thing as a time to die, rather than the role of medicine being always to sustain life at all costs. Today, pneumonia in the very old is fought with antibiotics. Meanwhile, 40% of those aged 85 or over are living with dementia. Our culture can still talk about an ‘untimely death’, but the idea that death is sometimes timely is harder for us to acknowledge. To anyone who has watched a person they love pass into the shadow of Alzheimer’s disease, the question can arise of whether there is indeed a time to die — and whether our way of living increasingly means that we miss that time, living on in a state that is neither life nor death.

To such thoughts, the answer will come: we are investing great amounts of money and talent in the search for a cure to Alzheimer’s.

And, for that matter, in the search for a cure for ageing and a cure for death.

If I were to claim that these goals are unattainable, I would be exceeding the bounds of my knowledge. Instead, to those who seek them, I would make two suggestions.

First, as I have tried to show, the search for life extension is not the natural continuation of the trends that have led to increased life expectancy over the last handful of generations. The bulk of the achievements in life expectancy have been the result of public health improvements, rather than high-tech medicine, and their overall effect has been to increase the likelihood of growing old, rather than change the definition of what it is to have grown old.

Secondly, it seems to me that the pursuit of vastly longer human lifetimes is itself a culturally peculiar goal. To see it as desirable to live forever is to have a particular understanding of what it is to be a person: to place oneself at the centre of the universe, rather than to see oneself as part of a chain of generations.

When I look at my son, I feel gratitude for the chance at life that he has. I hope to live to see him grow strong and take the place that is mine today, as I learn how to grow old and take the place which is now my parents’. And I hope that he will outlive me.

I know that there is much that he and I can almost take for granted, just now, that our ancestors could not. Yet I suspect that my hopes are not so different to theirs, and as I hold him and look into his new face, I understand myself more clearly as a small part within something vastly larger.


First published by Mooria magazine.