Childish Things

    It was September and I hadn’t seen Ruben all summer, but there he was, the same as ever, gangly and lounging, his hair cropped almost to the bone, his eyes alert; a kid from the wrong side of town who turns the skills his childhood taught him into art. That summer, I’d become a father. The weeks of July and August tightened into the small world of our new family, living by old rhythms of bodily need. (I must have said something about this – about the way it shatters whatever illusions you had of your own centrality, how it locks you into the chain of generations and releases you from any compulsion to make your one life a story in itself.) And I asked him, ‘So, how was your summer? What have you been up to?’

    ‘I gave my sermon on the mount,’ he said, like it was a matter of fact, and it turned out that it was.

    One Friday night, 150 mostly young people had followed him up a rocky hill on the edge of town (the town where he grew up, an hour south of Stockholm) to where the birch trees clear, and they sat on the ground and listened as he spoke. There were no flowing robes; he wore an Adidas tracksuit top and carried a binder with his notes. He wasn’t playing the messiah, trying to start a cult; nor was he playing the artist, making a point by appropriating the forms of religion. As the sun went down over the pines, he talked about life as a journey through the woods at dusk, each of us carrying a pocket-light of reason: its beam cuts a bright tunnel, but throws everything outside this tunnel into darkness; if we use it thoughtlessly, we forget that we have other senses with which to find our way.

    When the sermon and the discussion that followed were at an end, the congregation made their way quietly down among the trees, the twilight deepening around them.

    * * *

    A few years before, I had made a book with the video artists Robert and Geska Brečević, who operate as Performing Pictures. Around the time we met, their work took an unexpected turn as they began collaborating with craftworkers in Oaxaca and Croatia, building roadside chapels and producing video shrines that set the saints in motion. Our book was a document of this work but also an enquiry into how it came about, what had drawn them to the folk Catholicism of the villages where they were now working, and the reactions this had provoked among their art-world contemporaries. About these reactions, I wrote:

    We are used to art that employs the symbols of religion in ways seemingly intended to unsettle or provoke many of those to whom these symbols matter. Yet to the consumers of contemporary art, those who actually visit galleries, it is more uncomfortable to be confronted with work in which such symbols are used without the frame of provocation.

    That may still be the case, yet these days I am struck by how many of the artists, writers and performers I meet find themselves drawn to the forms and practices of religion.

    I think of Ben who went off to Italy to start an ‘unMonastery’, a working community of artists in service to its neighbours. The name suggested a desire to distance themselves from the example of the religious community, even as they found inspiration there. A couple of years spent facing the difficult realities of holding a community together, however, deepened their appreciation for the achievement of those who had maintained monasteries for generations, and this was reflected in a series of conversations which Ben went on to publish with abbots of established religious orders.

    For some, it’s a question of taking on the roles religion used to play, using the tools of ritual to address the ultimate. When I run into Emelie, a choreographer friend, she’s just back from a small town in the middle of Sweden where a group of artists has taken over the old mine buildings. It’s the kind of place that lost its purpose with the passing of the industry which called it into being. The project started with two brothers who grew up there – and this weekend, they have been celebrating the younger brother’s birthday. The way I hear it, the celebration was a three-day ritual which saw participants building their own coffins only to be lowered into them, emerging after several hours to be greeted with music and lights and a restorative draught of vodka.

    In another mining town a thousand miles away, Rachel Horne made her first artwork at the site of the colliery where four generations of her family had worked. Out of Darkness, Light was a memorial event: one night on the grassed-over slag heap above the town, 410 lamps were lit, one for each of the men and boys who died in the century in which coal was mined there. On a boat travelling along the river below, a group of ex-miners and their children told their stories. This was art as ritual, honouring the dead in such a way as to bring meaning to the living.

    Last time I spoke to Rachel, we talked about an event that she had put on a few weeks earlier. ‘You know,’ she said with a sigh, ‘it was like organising a wedding!’ I knew: months of energy building up to a big day and afterwards everyone involved is exhausted. Weddings are great, but how many do you want to have in a lifetime? It hit me, as artists we’re good at ‘weddings’, but sometimes what’s called for is the simplicity of the weekly Sunday service. Soon afterwards, I came to a passage in Chris Goode’s The Field and the Forest where he quotes a fellow theatre-maker, Andy Smith:

    Every week my mum and dad and some other people get together in a big room in the middle of the village where they live. They say hello to each other and catch up on how they are doing informally. Then some other things happen. A designated person talks about some stuff. They sing a few songs together. There is also a section called ‘the notices’ where they hear information about stuff that is happening. Then they sometimes have a cup of tea and carry on the chat.

    Both Smith and Goode are impressed by the resemblances between the Sunday service and the kinds of space they want to make with theatre. The connection is not made explicit, but when Goode ends his book with a vision of a ‘world-changing’ theatre where ‘once a fortnight at least, there’s someone on every street who’s making their kitchen or their garage or the bit of common ground in front of their estate into a theatre for the evening’, I think back to that passage and the distinction between the wedding and the weekly service.

    * * *

    I could go on for a while yet, piling up examples, but it’s time to pull back and see where this might get us. The artists I’ve mentioned are all friends, or friends of friends, so I can’t pretend to have made an objective survey. I don’t even know if such a survey could be made, since much of what I’m describing takes place outside the official spaces of art. Even the objects produced by Performing Pictures, though they sometimes hang in galleries, are made to be installed in a church or at a roadside.

    There is nothing new, exactly, about artists tangling with the sacred – indeed, the history of this entanglement is the thread I plan to follow through these pages. Yet here in the end-times of modernity, under the shadow of climate change, I want to voice the possibility that these threads are being pulled into a new configuration. There’s something sober – pragmatic, even – about the way I see artists working with the material of religion. The desire to shock is gone, along with the skittering ironies of postmodernism; and if ritual is employed, it is not in pursuit of mystical ecstasy or enlightened detachment, but as a tool for facing the darkness. I’m struck, too, by a willingness to work with the material of Western religious tradition, with all its uncomfortable baggage, rather than joining the generations of European artists, poets and theatre-makers who found consolation in various flavours of orientalism.

    All this has set me wondering: what if the times in which we find ourselves call for some new reckoning with the sacred? What if art is carrying part of what is called for? And what if answering the call means sacrificing our ideas about what it means to be an artist?

    A Strange Way of Talking About Art

    We have been making art for at least as long as we have been human. Ellen Dissanayake has made a lifelong study of the role of art within the evolution of the human animal, and she is emphatic about this:

    Although no one art is found in every society … there is found universally in every human group that exists today, or is known to have existed, the tendency to display and respond to one or usually more of what are called the arts: dancing, singing, carving, dramatizing, decorating, poeticizing speech, image making.

    Yet the way such activity gets talked about went through an odd shift about 250 years ago. In Germany, France and Britain, just as the Industrial Revolution was getting underway – and with colonialism pushing Western ideas to the far corners of the world – a newly extravagant language grew up around art. The literary critic John Carey offers a collage of this kind of language, drawn from philosophers, artists and fellow critics:

    The arts, it is claimed, are ‘sacred’, they ‘unite us with the Supreme Being’, they are ‘the visible appearance of God’s kingdom on earth’, they ‘breathe spiritual dispositions’ into us, they ‘inspire love in the highest part of the soul’, they have ‘a higher reality and more veritable existence’ than ordinary life, they express the ‘eternal’ and ‘infinite’, and they ‘reveal the innermost nature of the world’.

    Bound up with this new way of talking is the figure of the artistic genius. There have always been masters, artists whose skill earns them a place in the memory of a culture. In his account of the classical Haida mythtellers, the poet and linguist Robert Bringhurst is at pains to stress the role of individual talent within an oral literature, where a modern reader might expect to encounter the nameless collective voice of tradition. Yet a fierce respect for mastery does not presuppose a special kind of person whose inborn capacity makes them, and them alone, capable of work that qualifies as ‘art’. Rather, as Dissanayake shows, in most human cultures, it has been the norm for just about everyone to be a participant in and appreciator of artistic activity.

    The ideas about art which took hold in Western Europe in the late 18th century spread outwards through cultural and educational institutions built in Europe’s image. Were anyone to point out their peculiarity, it need not have troubled their proponents, for the contrasting ideas of other cultures could be assigned to a more primitive phase of development. Today, that sense of superiority has weakened and become unfashionable, although it remains implicit in much of the thinking that shapes the world. Under present conditions, a critic like Carey can take glee in mocking the heightened terms in which Kant and Hegel and Schopenhauer wrote about art; yet the result is a deadlocked culture war in which defenders of a high modern ideal of art are pitched against the relativists at the gates.

    Rather than pick a side in this battle, it might be more helpful to ask why art and the figure of the artist should take on this heightened quality at the moment in history when they did. If a new weight falls onto the shoulders of the artist-as-genius, if the terms in which art is talked about become charged with a new intensity, then what is the gap which art is being asked to fill?

    That the answer has something to do with religion is suggested not only by the examples which Carey assembles, but also by the sense that he is playing Richard Dawkins to the outraged true believers in high art. And there have been those, no doubt, for whom art has played the role of religion for a secular age. But this hardly gets below the surface of the matter; the roots go further down in the soil of history. It is time to do a little digging.

    The Elimination of Ambiguity

    In 1696, an Irishman by the name of John Toland published a treatise entitled Christianity Not Mysterious. This was just one among a flurry of such books and pamphlets issuing from the London presses in the last years of the century, but its title is emblematic of the turn that was taking place as Europe approached the Enlightenment: a turn away from mystery, ambiguity and mythic thinking.

    As the impact of the scientific revolution reverberated through intellectual culture, the immediate effect was not to undermine existing religious beliefs but to suggest the possibility of putting them on a new footing. If Newton could capture the mysterious workings of gravity with the tools of mathematics, then the laws governing other invisible forces could be discovered. In due course, this would lead to a mechanical account of the workings of the universe, stretching all the way back to God.

    In its fullest form, this clockwork cosmology became known as deism: a cold reworking of monotheistic belief, offering neither the possibility of a relationship with a loving creator, nor the firepower of a jealous sky-father protecting his chosen people. The role of the deity was reduced to that of ‘first cause’, setting the chain reaction of the universe in motion. Stripped of miracles, scripture and revelation, deism never took the form of an organised religion or gained a substantial following. It attracted many prominent intellectual and literary figures in England, however, in the first half of the 18th century, before spreading to France and America, where it infused the philosophical and political radicalism which gave birth to revolutions.

    The religious establishment recoiled from deism and its explicit repudiation of traditional doctrine. Yet mainstream Christianity was travelling the same road, accommodating its cosmology to the new science in the name of natural theology, applying the tools of historical research to its scriptures and seeking to demonstrate the reasonableness of its beliefs. The result was a form of religion peculiarly vulnerable to the double earthquake which was to come from the study of geology and natural history. Imagine instead that the rocks had given up their secrets of deep time to a culture shaped by the mythic cosmology of Hinduism: the discovery would hardly have caused the collective crisis of faith which was to shake the intellectual world of Europe in the 19th century.

    To this day we live with the legacy of this collision between naturalised religion and the revelations of evolutionary science; militant atheists clash with biblical literalists, united in their conviction that the opening chapters of Genesis are intended to be read as a physics and biology textbook. It is an approach to the Bible barely conceivable before the 17th century.

    * * *

    Mystery can be the refuge of scoundrels; ambiguity, a cloak for muddle-headedness. The sacred has often been invoked as a way of closing off enquiry or to protect the interests of the powerful. We can acknowledge all of this and deplore it without discarding the possibility that reality is – in some important sense – mysterious. It takes quite a leap of faith, after all, to assume that a universe as vast and old as this one ought to be fully comprehensible to the minds of creatures like you and me.

    Among the roles of religion has been to equip us for living with mystery. This is not just about filling the gaps in current scientific knowledge or offering comforting stories about our place in the world. Across many different traditions there is an underlying attitude to reality: a common assumption that our lives are entangled with things which exceed our grasp, which cannot be known fully or directly – and that these things may nonetheless be experienced and approached, at times, by subtler and more indirect means.

    This attitude shows up in the deliberate strangeness of the way that language is used in relation to the sacred. The thousand names of Vishnu, the ninety-nine names of Allah: the multiplication of such litanies hints at the limits of language, reminding us that words may reach towards the divine but never fully comprehend it. A similar effect is achieved by the Tetragrammaton, the four-letter name of God in the Hebrew Bible, written without vowels so as to be literally unspeakable.

    For Christians, a classic expression of this attitude to reality appears in Paul’s first letter to the church at Corinth, from the chapter on love that gets read at weddings:

    When I was a child, I spake as a child, I understood as a child, I thought as a child; but when I became a man, I put away childish things. For now we see through a glass, darkly; but then face to face: now I know in part; but then shall I know even as also I am known. (1 Corinthians 13:11–12)

    The emphasis is on the partial nature of knowledge: in relation to the ultimate, our understanding is childlike, a dark reflection of things we cannot see face-to-face. The most memorable of English translations, the King James Version gives us the image of a ‘glass’, but the mirror which Paul has in mind would have been of polished brass. Indeed, the image is carefully chosen, for the Greek city of Corinth was a centre for the manufacture of such mirrors.

    The thought that there are aspects of reality which can be known only as a dark reflection calls up another Greek image. The myth of Perseus is set in motion when the hero is given the seemingly impossible task of capturing the head of the Gorgon Medusa, whose gaze turns all who look on her to stone. The goddess Athena equips Perseus with a polished shield; by the reflection of this device, he is able to approach the monster, hack off her hissing head and bag it safely up. In the shield of Perseus we glimpse the power of mythic thinking: by way of images, myth offers us indirect means of approaching those aspects of reality to which no direct approach can be made.

    Few passages in the Bible are more at odds with the spirit of the Enlightenment than Paul’s claim about the limits of human knowledge. To put away childish things was the ambition of an age in which the light of reason would shine into every corner of reality. What need now for dark reflections – or mythic shields, for that matter? By the turn of the 18th century, such things were no longer intellectually respectable: the unknown could be divided into terra incognita, merely awaiting the profitable advance of human knowledge, and old wives’ tales that were to be brushed away like cobwebs.

    The institutional forms of religion were capable of surviving this turn away from mystery, though much was lost along the way, and none of the later English translations of the Bible can match the poetry of the King James. Meanwhile, if anyone were to go on lighting candles at the altars of ambiguity, it would be the poets and the artists, the ones upon whose shoulders a new weight of expectation was soon to fall.

    Toys in the Attic

    When the educated minds of Europe decided that humankind had come of age, the immediate consequence for art was a loss of status. If all that is real is capable of being known directly, then the role of images and stories as indirect ways of knowing can be set aside, relegated to entertainment or decoration.

    I say immediate, but of course there was no collective moment of decision; we are dealing rather with the deep tectonic shifts which take place below the surface fashions of a culture, and the extent to which the ground has moved may be gauged, as much as anything, through the discovery of what was once possible and is no longer, like the epic poem. The pre-eminent English poet of the first half of the 18th century, Alexander Pope aspired to match the achievement of Milton’s Paradise Lost by producing an epic on the life of Brutus; yet, despite years of telling friends that the project was nearing completion, all that he left upon his death was a fragment of eight lines. The failure seems more than personal, as though the mythic grandeur of the form was no longer available in the way it had been a lifetime earlier.

    In Paris in 1697, a year after Christianity Not Mysterious had rolled off the London presses, Charles Perrault’s Histoires ou contes du temps passé launched the fairy tale genre, committing the stories of oral tradition to print with newly added morals. By the time the first English translation was printed in 1729 – ‘for J. Pote, at Sir Isaac New-ton’s Head, near Suffolk Street, Charing Cross’ – the publisher could advertise Perrault’s tales as ‘very entertaining and instructive for children’. Stories which had been everyone’s, which carry layers of meaning by which to navigate the darkest corners of human experience, had now been tamed and packed off to the nursery.

    Meanwhile, a strange new form of storytelling arose which put a premium on uneventful description of the everyday and regarded unlikely events with suspicion. ‘Within the pages of a novel,’ writes Amitav Ghosh, ‘an event that is only slightly improbable in real life – say, an unexpected encounter with a long-lost childhood friend – may seem wildly unlikely: the writer will have to work hard to make it appear persuasive.’ A masterful novelist himself, Ghosh is nonetheless troubled by the 18th-century assumptions encoded within the form in which he writes. What troubles him most is the thought that these assumptions underlie the failure of the contemporary imagination in the face of climate change.

    In the kinds of story which our culture likes to take seriously, all of the actors are human and most of the action takes place indoors. Such realism is ill-equipped to handle the extreme realities of a world in which our lives have become entangled with invisible forces, planetary in scale, which break unpredictably across the everyday pattern of our lives. The writer who wants to tell stories that are true to this experience had better go rummaging in the attic where the shield of Perseus gathers dust among the toys, the sci-fi trilogies devoured in teenage weekends and the so-called children’s literature where potent materials exiled to the nursery grew new tusks.

    But writer, beware: the boundaries of the serious literary novel are still policed against intrusions of myth or mystery, and the terms used to police them are telling. In notes for a never-finished review of Brideshead Revisited, written on his own deathbed, George Orwell marks his admiration for Waugh as a novelist, but then comes the breaking point: ‘Last scene, where the unconscious man makes the sign of the Cross … One cannot really be Catholic and a grown-up.’ Almost half a century later, Alan Garner met with the same charge when his novel Strandloper was published as adult literary fiction. The Guardian’s reviewer, Jenny Turner, found the author guilty of crossing a line with his insistence on depicting Aboriginal culture on its own terms:

    … such a phantastic view of history cannot ever rationally be made to stand up. This underlying irrationality usually works all right in poetry, which no one expects to make a lot of sense. It’s okay in children’s writing, which no one expects to be psychologically complex. But in a grown-up novel for grown-ups, it just never seems to work.

    Carrying the Flame

    As Paganini … appeared in public, the world wonderingly looked upon him as a super-being. The excitement that he caused was so unusual, the magic he practised upon the fantasy of the hearers so powerful, that they could not satisfy themselves with a natural explanation.

    So wrote Franz Liszt on Paganini’s death in 1840. The Italian violinist and composer had been the model of a virtuoso: a dazzling performer who stuns audiences with technical audacity and sheer force of personality. The term itself had taken on its modern meaning within his lifetime, shaped by his example. In those same years, an unprecedented cult of personality grew up around the Romantic poets, while in the theatres of Paris and London a strange new convention had emerged, according to which audiences sat in reverential silence before the performers; half a century earlier, theatres were still such rowdy spaces that an actor would be called to the front of the stage to repeat a favourite speech to the hoots or cheers of the crowd.

    A new sense was emerging of the artist as a special category of human. The conditions for this had been building for a long time. In ‘Past Seen from a Possible Future’, John Berger argues that the gap between the masterpiece and the average work has nowhere been so great as within the tradition of European oil painting, especially after the 16th century:

    The average work … was produced cynically: that is to say its content, its message, the values it was nominally upholding, were less meaningful for the producer than the finishing of the commission. Hack work is not the result of clumsiness or provincialism: it is the result of the market making more insistent demands than the job.

    Under these conditions, to be a master was not simply to stand taller than those around you, but to be looking in another direction. In the language of Berger’s essay, such masterworks ‘bear witness to their artists’ intuitive awareness that life was larger’ than allowed for in the traditions of ‘realism’ – or the accounts of reality – available within the culture in which they were operating. Dismissed from these accounts were those aspects of reality ‘which cannot be appropriated’.

    Berger warns against making such exceptions representative of the tradition: the study of the norms constraining the average artist will tell us more about what was going on within European society. Still, exceptionality of achievement fuelled the Romantic idea of the artist set apart from the rest of society. If the Enlightenment established lasting boundaries around what it is intellectually respectable for a ‘grown-up’ to take seriously, then the Romantic movement inaugurated a countercurrent which has proven as enduring. In Culture and Society, Raymond Williams identifies a constellation of words – ‘creative’, ‘original’ and ‘genius’ among them – which took on their current meanings in the late 18th and early 19th centuries, as part of this new way of talking about the figure of the artist.

    The artists themselves were active in creating this identity. Here is Wordsworth, in 1815, addressing the painter Benjamin Haydon:

    High is our calling, Friend! – Creative Art …
    Demands the service of a mind and heart
    Though sensitive, yet in their weakest part
    Heroically fashioned – to infuse
    Faith in the whispers of the lonely Muse
    While the whole world seems adverse to desert.

    Keats’ formulation of ‘Negative Capability’, the quality required for literary greatness, is among the clearest statements of the role which now falls to the artist, a figure who must be ‘capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason.’

    * * *

    I have been making a historical argument, though it is the argument of an intellectual vagabond who goes cross-country through other people’s fields. Since we are now coming to the height of the matter, let me take a moment to catch my breath – and recall an earlier attempt at covering this ground, made in the third chapter of the Dark Mountain manifesto:

    Religion, that bag of myths and mysteries, birthplace of the theatre, was straightened out into a framework of universal laws and moral account-keeping. The dream visions of the Middle Ages became the nonsense stories of Victorian childhood.

    The claim towards which I have been building here is that those elements which became increasingly marginalised within respectable religious and intellectual culture by the middle of the 18th century found refuge in art. In many times and places, and perhaps universally, the activity of art has been entangled with the sacred, with the rituals and deep stories of a culture, its cosmology, the meaning it finds or makes within the world – and all of this wound into the rhythms which structure our lives. What is new in the historical moment around which we have been circling is the sense that the sacred has passed into the custody of art: insomuch as the sacred dwelt with mystery, ambiguity and mythic thinking, it now fell to the artist to keep the candle alight. Here, I submit, is the source of the peculiar intensity with which the language of art and the figure of the artist are suddenly charged.

    * * *

    If art has carried the flame of the sacred through the cold landscapes of modernity, it has not done so without getting burned. The scars are too many to list here, but I want to touch on two areas of damage.

    First, the roles assumed by artists over the past two centuries have overlapped with those which might in another time or place have been the preserve of a priest or prophet. In a culture capable of elevating an artist to the status of ‘super-being’, there is a danger here: the framework of religion may remind adherents that the priest is only an intermediary between the human and the divine, but there are no such checks in the backstage VIP area. The danger is that the show ends up running off the battery of the ego instead of plugging in to the metaphysical mains. Even when an artist sees her role as a receiver tuned into something larger than herself, without a common language in which to speak of the sacred, the result may be esoteric to an isolating degree. How much of the self-destruction which becomes normalised – often romanticised – as part of the artistic life can be traced to the lack of a stabilising framework for making sense of the mysteries of creative existence?

    Another danger arises from the exceptional status of the artist. While the reality of artistic life is often precarious, there exists nonetheless a certain exemption from the logic which governs the lives of others: the artist is the one kind of grown-up who can move through the world without having to explain their rationale, whether monetary, vocational or otherwise. In theory, at least, if you can get away with calling yourself an artist, you will never be required to demonstrate the usefulness, efficiency or productivity of your labours. Where public funding for the arts exists, if you can prove your eligibility, you may even join the privileged caste of those for whom this theory corresponds to reality. (And you may not: ‘performance targets’ for funded arts organisations can be punishingly unreal.)

    The danger of the artistic exception is that it serves to reinforce the rule: get too comfortable with your special status as the holder of an artistic licence and you risk sounding at best unaware of your privilege, at worst an active collaborator in the grimness of working life for your non-artist peers. (Arguably, the only ethical model of artistic funding is a Universal Basic Income, which is how many young writers, artists and musicians approached the unemployment benefits system of the UK as recently as the 1980s.)

    Begin Again

    And here we are, back in the early 21st century, where the legacies of the Romantics and the Enlightenment are both persistent and threadbare. We don’t know how to think without them, and yet they seem out of credit, like a congregation that attends out of habit rather than conviction, or not at all.

    A few years back, there was a fire at the Momart warehouse in east London. Among the dozens of artworks that went up in smoke were Tracey Emin’s tent and the Chapman brothers’ Hell. John Carey has some fun setting the reactions of callers to radio phone-ins against all those high-flown statements about the spiritual value of art: ‘Only in a culture where the art-world had been wholly discredited could the destruction of artworks elicit such rejoicing.’

    Under these conditions, do I truly propose to lay a further weight on the shoulders of my artist friends – to charge them with the task of reconfiguring the sacred? Not quite.

    If art gave refuge to the sacred and served as its most visible home in a time when it was otherwise scoured from public space, I believe the time has come for art to let it go. In the world we are headed into, it won’t be enough for an artist caste to be the custodians, the ones who help us see the world in terms that slip the net of measurable utility and exchange. One way or another, the ways of living which will be called for by the changes already underway include a recovery of the ability to value those aspects of reality which cannot be appropriated, which elude the direct gaze of reason, but which so colour our lives that we would not live without them.

    This is not a call for a new religion, nor for a revival of anything quite like the religions with which some of us are still familiar. I have met the sacred in the stone poetry of cathedrals and the carved language of the King James Bible, but buildings and books never had a monopoly. For that matter, art was not the only place the sacred found shelter, nor even the most important – though it was the grandest of shelters and the one that commanded most respect, here in the broken heartlands of modernity. Out at the places we thought of as the edges, there were those who knew themselves to be at the centres of their worlds, and who never thought us as clever as we thought ourselves. Even after all the suffering, after all the destruction of languages and landscapes and creatures, there are those who have not given up. But if we whose inheritance includes the relics of Christianity, Enlightenment and Romanticism have anything to bring to the work that lies ahead, then I suspect that one of the places it will come from is the work of artists who are willing to walk away from the story of their own exceptionality.

    And though I know that I am drawing simple patterns out of complex material, it seems to me that something like this has begun, at least in the corners of the world where I find myself. I don’t think it is an accident that several of the artists I have invoked here returned to work in the towns where they grew up; the pretensions you picked up in art school are not much use on the streets where people knew you as a child.

    Unable to appeal to the authority of art, you begin again, with whatever skills you have gathered along the way and whatever help you can find. You do what it takes to make work that has a chance of coming alive in the spaces where we meet, to build those spaces in such a way that it is safe to bring more of ourselves. This does not need to be grand; you are not arranging a wedding. A group of strangers sits around a table and shares a meal. A visitor tells a story around a fire. You half-remember a line you heard as a child, something about it being enough when two or three are gathered together.


    Published in Dark Mountain: Issue 12 ‘SANCTUM’, a special issue on the theme of ‘the sacred’.

  • How Climate Change Arrives

    How Climate Change Arrives

    The sun is out, the sky is a cloudless blue and the kids around me on the train are talking football. On mornings like this, it’s hard to hold onto the sense that we are in trouble, let alone that this trouble might be deep enough to derail our whole way of living.

    Even the numbers involved are underwhelming: two degrees of warming by the end of the century, three degrees, four… We’ve heard all the warnings, and still it is hard to equate these numbers with disaster, when they are smaller than the variations on the weather map from one day to the next.

    The year before last, I got a Facebook message from a Sami woman, a reindeer herder whose family follows the animals north each summer across the mountains from Sweden to Norway. A few days later, we were sitting drinking coffee in a meeting room in Stockholm. She talked about a journey to fix up her uncle’s cabin in early May, travelling on a winter road, the kind of road that runs across a frozen river. The river is always frozen until the third week of May — you can count on it — but this time, when they get there, it has already thawed. There’s no getting across. Further north, the same summer, they come to a mountain where they always store food in the ice of a glacier, but this year the glacier is gone. In July, the temperature stays over thirty for three straight weeks as the reindeer huddle, miserable in the heat. This is not the future, not a warning about what happens if we fail: this is how things are, already. If your life is bound to the seasons, you don’t need charts or projections to know that something is going badly wrong.

    Our lives are bound to other things. Where we live, you can change seasons almost as easily as channels on the TV. Summer or winter are only ever an air ticket away. We see strawberries in Tesco in December and the strangeness of this hardly registers. Our liberation from the constraints of the seasons is assumed to be progress, but it might be wiser to call it an illusion. All that food in the supermarket is coming from places where the seasons still count. We still live off soil and sun and rain. There is no question of going ‘back to the land’, because we never left: we just stretched the chains that link us to it so far that we lost all sense of what lies at the other end.

    For now, a sharp tug on the supply chain means an unwelcome bulge in our grocery bills, a corner to cut somewhere else in the household budget. Elsewhere, the consequences cut deeper. The Arab Spring started when Tunisian police confiscated the fruit stall of street-trader Mohamed Bouazizi. He burned himself to death in a desperate protest against corruption, but the waves of protest that followed across North Africa and the Middle East were fuelled by years of sharply rising food prices. The brutal war in Syria came on the heels of five years of drought. This is how climate change arrives, not as a clean case of cause and effect, but tangled up with the cruelties of dictators and the profits made from commodity market speculation, washing up in boats on package holiday coastlines.

    I don’t mean this as a call to guilt or despair. If you write about climate change, there’s a pressure to be upbeat, to talk about changing lightbulbs and the falling cost of solar panels. Not long ago, Britain went a day without burning coal for the first time since the Industrial Revolution. These things are also part of the story. I want to tell you, too, about all the knowledge that is barely on the maps we were given at school. Like how, even today, only 30% of the world’s food is produced within the agro-industrial system, while half of it is grown by peasant farmers, people who still have one foot in ways of making life work that are older than the fossil fuel economy. A Somerset farmer has three Syrian teenagers sent to him on a scheme: the first morning, they clear a weedy patch of land in no time, then one lad picks up a handful of soil and squeezes it in his hand. ‘Good humus,’ he says. Those already living with the consequences of climate change are not simply victims, they may yet be carriers of badly-needed knowledge in the tight times ahead.

    So yeah, I don’t want to doom you out. I just think we owe ourselves a little sobriety, a willingness to look hard at where we find ourselves and get a sense of what may be at stake. That last bit is tricky: one moment, we’re urged to ‘Save the Planet’ — like the stars of a superhero movie — and the next, we’re looking at a poster behind the Marks & Spencer’s checkout that says, ‘Plan A: Because there is no Plan B.’ And the more times you look at that poster, the more you have to ask, ‘No Plan B for who?’ For M&S and Tesco and strawberries in December and holidays in the Greek islands — or for liveable human existence? Or is that not a distinction we’re willing to consider?

    Don’t get me wrong: I’ll be stopping in at the supermarket when I pick up my son from nursery this afternoon. It’s just that my dad can remember when the supermarkets arrived: my gran would ride halfway across Birmingham and back on the buses to claim the free frozen chicken you got on opening day. I can’t pretend the convenience doesn’t suit me. But if we’re really saying the future of our 4.5-billion-year-old planet is in doubt, then I’m not sure it’s wise to stake everything on getting to hang onto a way of doing things that’s been around for less than a lifetime.


    This essay first appeared in The Precariat, a newspaper published by the organisers of Planet B festival and distributed in Peterborough in July 2017.

  • The Fall of the Murdoch Wall

    The Fall of the Murdoch Wall

    The kaleidoscope has been shaken, the pieces are in flux, soon they will settle again. Before they do, let us reorder this world around us…

    Tony Blair, 2 October 2001

    I didn’t make it to bed on election night, so it took till Saturday morning to have the experience of waking up in this new reality. All day, I felt a lightness, like the laws of physics had changed slightly — and scrolling through Facebook, I saw others trying to make sense of this strange sensation. Mixed in among these posts, though, were others that boiled down to, ‘Will you all stop smoking whatever it is you’re smoking?’

    With that in mind — and with one or two sobering caveats — I want to explain why I’m convinced what happened last Thursday is one of the two most important and hopeful events in British politics in my lifetime. And why that’s still true, even if you have no time for Corbyn’s politics or his party. (In which case, you can probably skip the next couple of paragraphs.)

    First, the sobering bit. Labour lost — it just lost less badly than everyone expected. May is back in Downing Street, promising another five years of Tory rule, only this time propped up by the even-nastier party. There’s plenty been said already about why the role of the DUP is troubling — not least, its potential to jeopardise what must be the most important and hopeful development in British politics in most of our lifetimes, peace in Northern Ireland. Oh yes, and meanwhile, a prime minister who couldn’t manage a competent election campaign is about to embark on the multidimensional chess of the Brexit negotiations.

    Now, you can come back against some of that: Labour’s vote grew by more than at any election since 1945, the party has momentum on its side, and neither May nor anyone else will be leading a Tory government for a full term. If it doesn’t fall sooner, a handful of lost by-elections will wipe out this government’s majority. (A thought sure to concentrate the minds of by-election voters — and Westminster averages about five by-elections a year.)

    But I want to talk about something more important.

    We’ve just had an election in which the full weight of The Sun and The Daily Mail was thrown at destroying Jeremy Corbyn and the Labour party — and, by any standards, failed to do so. This is so big that, among the rest of the post-election turmoil, I don’t think we’ve grasped what it means yet.

    Since the 1980s, British politics has been locked in a basement by a gang of abusers, systematic perverters of democracy, chief among them Rupert Murdoch and Paul Dacre. 8 June 2017 should be remembered as the day that we escaped.

    That look you see on the face of Labour MPs who spent two years opposing Corbyn at every turn — it’s the baffled gaze of battery chickens who find the door to their cage left open. Their every reflex was formed by fear of a corrupted and corrupting media. Now, they are disoriented by the possibility of freedom.

    I want to talk about why the current Labour leadership is strangely well-placed to take advantage of this altered reality — and why seeing what just happened in these terms may be more helpful, when it comes to bridging divides, than assuming that resistance to Corbyn within the parliamentary party was all about ideological divisions.

    So, there are going to be four parts to this story: the first is about the British media, the second about Labour, the third about the Tories, and the fourth about what kind of an event this is — and where things go next.

    The Media

    When Corbyn was elected leader of the Labour party, the British press went into overdrive. According to a study by LSE researchers, only 11% of articles about Corbyn represented his views without alteration; in 74% of articles, his views were either ‘highly distorted’ or not represented at all. The leader of the parliamentary opposition was systematically delegitimised ‘through lack of voice or misrepresentation’, ‘through scorn, ridicule and personal attacks’ and ‘through association’ with terrorists and dictators.

    A newspaper can be as partisan as its editor and owner want it to be, but UK broadcasters are subject to a duty of impartiality. Yet the BBC seems to have been at best powerless to stop — and at worst complicit in — the capture of British democracy by a small ring of powerful abusers. It became so systematic, so embedded in the culture, that complaints weren’t taken seriously: when the BBC Trust upheld a complaint over a report in which Laura Kuenssberg made it look like Corbyn was answering a different question to the one he was asked, the director of BBC News dismissed the finding. Eighteen months later, in the final days of the election campaign, that misleading clip was still being shared widely with nothing to alert viewers to the upheld complaint. Overall, the role of broadcasters has been to recycle and amplify the newspaper attacks on Corbyn — something Barry Gardiner called out the Today programme over early in the election campaign.

    The intensity of the press attacks on the current Labour leader may have been unprecedented, but it is part of a pattern of abuse that goes back — well, how far? I’ll be 40 this year, I’ve followed every UK election since 1987 (listening to Radio 4 on a transistor radio in the school playground), and I can’t remember a time when we had a normally-functioning democracy.

    But the point where it became undeniable was the 1992 election and the famous front-page claim: ‘It’s The Sun Wot Won It’. Whether that was true hardly matters — for the next 25 years, British politics was conducted on the assumption that it was. Until last Thursday.

    The Breaking of Labour

    Like a lot of the manoeuvres accompanying the birth of New Labour, Tony Blair’s courtship of Rupert Murdoch could be cast as a necessary evil. Yet there was always an excess to it; a suspicion that submission to Murdoch left him feeling excited, rather than sullied. The sense of betrayal which many Labour people feel when they think of Blair is usually explained in terms of Iraq, or of a preference for purity and principles over power; but when you think about what The Sun had done to Labour the last time around, the way Blair cultivated — and took pleasure in — his power-friendship with its owner was a fuck-you to the movement he was meant to be leading. And the impression of a weird edge to their relationship was bizarrely confirmed, after he’d left office, when he first became godfather to one of Murdoch’s daughters, then got accused by News Corp insiders of having an affair with Murdoch’s soon-to-be ex-wife.

    Gordon Brown’s experience with the press was more straightforwardly miserable. He fretted about what Murdoch would say, but lacked Blair’s knack for flirting with Labour’s natural enemies, and his attempts came off clumsily. (Remember the time he invited Margaret Thatcher to Downing Street?) Thinking back on the tormented figure he cut, the stories of rages and sulks and thrown computer hardware, I’m wondering now — was this the behaviour of a decent man who thought politics was a serious business, but found himself trapped instead inside a game where every move had to be calculated for how it would play on the front of the next day’s Sun?

    This brings us to Ed Miliband. Of all the politicians from New Labour’s ‘next generation’, he came closest to seeing the possibilities which Corbyn has now made a reality. Even his much-mocked meeting with Russell Brand in the closing days of the 2015 election campaign looks a lot less daft, given the wave of young and disenfranchised voters who showed up at the polls last week. But the instincts that drew Miliband in this direction were tripped up by a tendency to hesitation and to pessimism about politics.

    Two quick stories that show this.

    First, in December 2010, as the student movement started kicking off, Miliband apparently wanted to come down to the UCL occupation and talk with the students — an idea that divided his advisors, and that ultimately didn’t happen. Now, we can all guess what Corbyn would have done in his place, but the point is that Miliband’s instinct was to do the same thing — yet the supposed boundaries of what you can and can’t do in British politics, without getting destroyed, made him hesitate.

    Another story… Ten years ago, I became an internet entrepreneur by accident. A small project snowballed into an educational web start-up, and by the summer of 2007, one of my co-founders was faced with a decision — was he willing to commit to the responsibility with which we were about to find ourselves saddled? When we met, he’d been working at a think tank with close ties to New Labour — and one Sunday morning at a festival, he ran into an old friend who was now Miliband’s speechwriter. As they were talking about the choice he faced, Miliband himself strolled up and sat down beside them. Having listened for a while, he said, ‘You know, if I could start again, I’d be a social entrepreneur. That’s how you really change the world.’

    And then came Jeremy Corbyn. What mattered about this Labour leader was not that he came from so far to the left, but that he came from so far outside the game of ‘realistic’ politics which had led the likes of Miliband into that kind of pessimism about what politics could do. Meanwhile, the certainty of all the players within that game that he was headed for destruction meant he was spared the counsel of the kind of cautious advisors who fed Miliband’s hesitancy — because, for the past two years, those people just wouldn’t touch Corbyn with a bargepole.

    The Spoiling of the Tories

    The damage done to British politics by this decades-long cycle of abuse is obviously asymmetric — maybe I’m wrong, but I can’t see many areas in which the Tories’ desires have been constrained by the influence of Murdoch and Dacre. Yet, in their different ways, both parties have been deformed by that influence.

    While the systematic abuse of democracy bred a broken generation of politicians in the Labour party, it gifted the Tories a spoiled generation:

    • Some of them appear to truly believe the grim picture of the country they aspire to govern peddled by papers like the Daily Mail.
    • Others were trained in the arts of distortion and fabrication through earlier careers as journalists and columnists — and assume these skills are adequate to the task of governing a country.
    • None of them has had to engage in a real democratic tussle over the direction of the country, where their opponents don’t enter the ring already hamstrung.

    That’s how a party once led by Winston Churchill ends up with a prime minister who resembles a malfunctioning robot — and a clownish con-man as its leader-in-waiting.

    Again, this story goes back decades — but it came to a crunch in the past year. For just when, in Corbyn, Labour at last had a leader who didn’t fear the right-wing press, the Tories found themselves led by someone who aligned herself with them more tightly than her predecessors had. Theresa May sought to govern Britain as an avatar of the Daily Mail. As Anthony Barnett wrote in October, this meant a shift away from the dominance of Murdoch — which had lasted from the Thatcher era, through the New Labour years, and survived the phone-hacking scandal (in which David Cameron’s director of communications, the former News of the World editor Andy Coulson, was sent to jail). More than this, as Will Davies points out, the economic irrationality of Brexit left May’s Conservatism more dependent on both Dacre and Murdoch: in contrast to Thatcherism, ‘it can’t rely on cheerleading from the CBI or the Financial Times.’

    So the scene was set for the general election of 2017. It was not the threatened Brexit election — nor was it quite an election on the radical promises made by Corbyn’s Labour. (That’s what the next one will be about…) Rather, what we got was a Tory prime minister who had tied herself to the masthead of the Daily Mail versus a Labour leader with the guts to bet that the emperors would turn out to be naked. If the question was ‘Who governs Britain?’, the surge in support for Labour gave a resounding answer: not the Dacres and the Murdochs.

    And yet, among the rest of the past week’s noise, not everyone has heard — with Michael Gove returning to the front bench, the chatter is of Murdoch’s influence over the Tories rising again. Well, long may that continue.

    What kind of event was this election?

    This feels like the angriest and the most hopeful thing I’ve written in years. Thinking about the role of Murdoch and Dacre and their co-conspirators, the hold they’ve had over democracy in the UK, I keep coming back to phrases that suggest sexual abuse — and maybe that’s distasteful, I don’t know, but the anger hits me like it did when the BBC finally had to face up to having filled our childhood afternoons with celebrity paedophiles. Maybe it’s because of how long it’s gone on, how many people have known and treated it as just how things are. And maybe I’ve no right to use such an analogy, because it’s not something that’s ever been done to me. Honestly, I don’t know.

    What I do know is that we have another frame of reference for what happens when a gang of unelected bullies takes political control over a country and turns its ‘democracy’ into a pantomime, staged within limits which they get to determine. When I was a kid, half of Europe fitted that description — and then, one autumn, young people called the bluff of the people who thought they ran their countries, and it all came down faster than anyone could believe.

    I’m not saying what just happened is as big as the fall of the Berlin Wall, but the hope that’s mixed with the anger is because I think it might just be the same kind of event.

    What do I mean? Well, firstly, that this isn’t a swing of the pendulum. Word is that Murdoch stormed out of The Times’ election party when the exit poll was announced — and well he might, because we are never going back to a world in which he gets to determine what the Labour party can or can’t do. And that means that Britain is a democracy again — not a perfect democracy, and with an electoral system that’s badly in need of reform, but a country where real democratic change feels possible.

    The generational nature of what happened matters, too. Because it gives the lie to all the smug bullshit about young people not caring — and because those young people will be back, next time around — and because, just in terms of demographics, it means that the fear-fuelled, tabloid Toryism of this election campaign is on its way out.

    A wall that ran down the middle of British society has been breached, and my guess is that still more people have been pouring through it in the days since the election. That breach isn’t going to go away because the Tories find a less robotic front-person.

    As for Labour, it’s a strange chance that not only does the party find itself with a fearless and vindicated leader, but, in Tom Watson, a deputy leader who took on the Murdoch empire with courage over phone-hacking — making him a strikingly appropriate figure to help the party orient itself to a world in which Murdoch and his like are no longer to be feared.

    A final thought (or three)

    A few years ago, I sat in the office of an editor in Prague, a man who had been among the crowds in Wenceslas Square in those late autumn weeks of 1989. He’d been a student, then, and we talked about the disillusionments that followed.

    ‘I’ve lived 21 years under communism,’ he said slowly, ‘and 21 years under capitalism — and I can tell you what’s wrong with both.’

    Did he ever regret what they had done, I wondered?

    ‘I don’t regret what we did,’ he replied. ‘I regret what we let the grown-ups do, after we went home.’

    The fall of the Murdoch wall may be a huge thing for British democracy, but its rise was part of something bigger that stretches far beyond the rainy islands where I did my growing up. One day soon, I need to write up a set of thoughts that have been gathering for a year or so, about how we map the politics of ‘neoliberal realism’ and the search for the exits — a story that takes in the Brexit vote and Corbyn’s rise, but also the shifting political landscape in other corners of Europe.

    Meanwhile, beyond all this, there is the low background roar of loss, the knowledge that we are living in an age of endings. I’m writing this late at night, after the first day of a meeting on ‘rapid decarbonisation’ — and the message from the scientists here is beyond sobering. At times, it’s hard to hold in view the different scales of crisis: the unravelling of an economic ideology that’s less than a lifetime old, playing out against the backdrop of the end of a 10,000-year mild period in the Earth’s climate which happens to have encompassed all that we’ve known as civilisation, and an ongoing mass extinction, the sixth the planet has seen in its long life. All these endings are entangled with each other. We have brought about an almighty bottleneck, and it’s hard to say in what shape our kind will come through it, except that the journey will change us in ways beyond the imaginings of the things I’ve been writing about here.

    But if I stare at these realities and still, despite the woeful absence of such matters from the debate in this election, see some hope in the unexpected wave that just washed through Britain’s political system, it’s because it will take waves like this — sudden ruptures that spread like rumours through the spaces of conversation and networks of relations that make up our lives — if things are to turn out better than often seems likely in the tight times that lie ahead.

    I was going to wrap this up by saying something like, ‘Don’t go home and let the grown-ups fuck it up.’ But then I read Dan Hancox’s piece this morning on the extraordinary surge of grassroots campaigning that produced last week’s results, and I’m like — go home? As if you would. As if any of us are about to do that, now.


    Published on Medium in the wake of the 2017 UK general election.

  • You Want It Darker

    As things stand, I don’t believe we will get a story worth hearing until we witness a culture broken open by its own consequence.

    Martin Shaw, Dark Mountain: Issue 7

    The regular mechanisms of political narration are breaking down. The pollsters lose confidence in their methods, the pundits struggle to offer authoritative explanations for events that they laughed off as wild improbabilities only months before.

    It’s a measure of how badly things have broken that, over the past year or two, members of the strange crew that meets around Dark Mountain have found ourselves filling the gap. I’m thinking of posts we’ve written in our various corners of the internet that were read and shared far more widely than most of us are used to, seemingly because they helped readers find their bearings in a time of deepening disorientation.

    There’s a role for this kind of writing now that seems clearer than it did eight years ago, when we started this project. That’s why, today, we are launching a fundraising campaign – asking for your help to build and launch a new online publication. It won’t replace the Dark Mountain books, but it will run alongside them and provide an online home for writing that seeks – as my co-founder, Paul Kingsnorth, put it at the start of this series – ‘to make sense of things, and to examine our stories in their proper perspective.’

    At this point, if you want to head straight for our fundraising page and make a donation, then be my guest – but in the rest of this post, I want to make a few suggestions about why this kind of writing matters now, based on what Dark Mountain has taught me over the past eight years.

    * * *

    Let’s start with a few of the pieces I mentioned – the chances are you already read some of these, but setting them alongside one another, something else comes into view:

    These are posts that got shared and reblogged and quoted and seemed to travel halfway around the internet. Mostly, they were written for our personal blogs or websites – but the authors are editors or regular contributors here at Dark Mountain. You can see places where we spark off each other’s ideas, as well as significant differences in perspective. If you read them all, you’ll probably find some that jibe with you and others that jar. But I want to point to some common ground.

    For one thing, while we draw on different political traditions, this is writing that starts a couple of steps back from the familiar terrain of political debate and analysis. I’m reminded of an answer I gave, years ago, when asked if Dark Mountain was a political project: ‘I think there may be times when it is necessary to withdraw from today’s politics, in order to do the thinking that could make it possible for there to be a politics the day after tomorrow.’ Or as Paul put it at the opening of this series, ‘Sometimes you have to go to the edges to get some perspective on the turmoil at the heart of things. Doing so is not an abnegation of public responsibility: it is a form of it.’

    If you start exploring the work of any of these writers, you’ll find that mythology is a recurring reference point, a deep element in how we make sense of things. At the end of his post from the morning after the Brexit vote, Martin Shaw wrote, ‘Television, radio and internet will be able to tell you all the above-ground implications of what’s just taken place.’ When these surface accounts fail to satisfy, though, there’s a hunger that is fed by the underground currents of old stories.

    One of the things that marks out this writing, then, is a willingness to enter territory that we could call ‘liminal’. It’s a term that comes from the study of ritual, given to the middle phase of a rite of passage: the preliminaries are over, you have shed the skin of an old reality, but not yet acquired the new skin that would allow you to return to the everyday world. The liminal is the space of the threshold, with all the vulnerability and potential of transition: the costliness of letting go, with no guarantee of what will come after. The liminal phase of a ritual is the moment of greatest danger – or rather, ritual is a safety apparatus built around the liminal. Whichever, the liminal is where the work gets done, where the change happens.

    So here’s the first suggestion I want to make: if this writing is filling a gap left by the failure of more conventional kinds of political narration, it’s because it is able to operate in the territory of the liminal, and these are liminal times.

    * * *

    It’s not just the broadening audience for this writing that points to its timeliness. The past year also saw more conventional voices getting drawn into the territory that Dark Mountain has been exploring.

    Take Alex Evans, a former advisor to the UK government and the United Nations, who just wrote a book called ‘The Myth Gap’. After a career based on belief in the power of ‘evidence, data and policy proposals’, his experience of global climate negotiations brought him to a crisis, and to a sense of the need for something more than facts and reasoned arguments. ‘We’ve lost the old stories that used to help us make sense of the world,’ he says, ‘but without coming up with new ones.’ And he quotes Jung: ‘The man who thinks he can live without a myth is like one uprooted, having no true link either with the past, or the ancestral life within him, or yet with contemporary society.’

    Or check out the series on ‘spirituality and visionary politics’ that the political strategist Ronan Harrington edited for Open Democracy last year – and Jonathan Rowson’s report on spirituality for the RSA. ‘Scratch climate change confusion long enough,’ writes Rowson, ‘and you may find our denial of death underneath.’

    There’s lots to say about these examples, but for now I just want to take a couple of points from them. First, that the call of the liminal is making itself felt ‘above ground’. But then, that there is a danger of wanting to jump straight to rebirth, to promise bright visions and new positive narratives. Evans draws on Jung, but I’m not clear how much room there is here for the shadow – nor for the loss and uncertainty, the darkness and disorientation that are the price for entering the liminal.

    Then again, by the end of 2016, others were ready to make the descent. I once spent an hour on stage with George Monbiot pounding me over the pessimism of Dark Mountain, so it was striking to read his list of ‘The 13 impossible crises humanity now faces’. Then you had John Harris discovering Tainter’s The Collapse of Complex Societies. Watching experienced journalistic commentators move in the terrain that Dark Mountain has been exploring for the best part of a decade, it strikes me that there is another danger. To navigate at these depths, you need a different kind of equipment. Facts alone don’t cut it down here.

    This brings me to the other aspect of Dark Mountain which may be crucial to finding our bearings within the liminal – the centrality of art and culture to the work of this project.

    * * *

    A man is whispering in your ears, disorienting you, playing tricks with your perception, even as you watch him alone on stage with little more than a few bottles of water and a cast of microphones. This is Simon McBurney’s The Encounter, one of the most staggering pieces of theatre I witnessed in 2016: a show that leads you into the story of a meeting between a photographer lost in the Amazon and a tribe whose world is under threat. Their response to this threat takes the form of a ritual, a journey to ‘the beginning’, which is also a deliberate bringing to an end of their culture in its current form.

    The concept of liminality was first used to describe the structure of rituals like the one at the centre of The Encounter, but its application as a term for thinking about modern societies is connected to the study of theatre and performance. The anthropologist who made the connection, Victor Turner, distinguished the ‘liminal’ experiences of tribal cultures – in which ritual is a collective process for navigating moments of change – from the ‘liminoid’ experiences available in modern societies, which resemble the liminal, but are choices we opt into as individuals, like a night out at the theatre. This distinction comes with a suggestion that true liminality, the collective entry into the liminal, is not available within a complex industrial society.

    Now, perhaps this has been true – but here’s my next wild suggestion. The consequences of that very complex industrial society are now bringing us to a point where we get reacquainted with true liminality. To take seriously not just what Dark Mountain has been talking about, but what Monbiot and Harris are touching on, is to recognise that we now face a crisis which has no outside. The planetary scale of our predicament makes it as much a collective experience as anything faced by the tribal cultures studied by Turner and his colleagues.

    If this is the case, then where within our existing cultures do we go for knowledge about how to navigate the terrain of liminality? Not to the sources of factual authority, much as we need them, but to the places where liminoid practices have endured – to the arts, especially those forms in which people gather and share a live experience, and also (Turner would tell us) to those traditions and institutions that deal with the sacred.

    In 2016, I came to the end of two years working as leader of artistic development with Riksteatern, Sweden’s touring national theatre. The collaboration came about because their artistic director had been strongly influenced by the Dark Mountain manifesto. In the workshops we ran together, writers, directors and performers met around the question of what art can do, in the face of all that we know and fear about the depth of the mess the world is in.

    The answers that emerged began with a rejection of the usual invitation to put our art to use as a communications tool to deliver a message on behalf of scientists, policy-makers or activists – not out of some misplaced sense of ‘art for art’s sake’ purity, but because this isn’t how art works. 

    Instead, many of the possibilities I caught sight of during this work had to do with the liminal. Art can hold a space in which we move from the arm’s-length knowledge of facts, figures and projections, to the kind of knowledge that we let inside us, taking the risk that it may change us. Art can give us just enough beauty to stay with the darkness, rather than flee or shut down. Like the bronze shield given to Perseus by Athena, art and its indirect ways of knowing can allow us to approach realities which, if looked at directly, turn something inside us to stone. Art can call us back from strategic calculations about which message will play best with which target group, insisting on the tricky need for honesty – there’s a line I kept coming back to, from the playwright Mark Ravenhill, that your responsibility when you walk on stage is to be ‘the most truthful person in the room’. Art can teach us to live with uncertainty, to let go of our dreams of control. And art can hold open a space of ambiguity, refusing the binary choices with which we are often presented – not least, the choice between forced optimism and simple despair.

    These are strange answers. For anyone in search of solutions, they will sound unsatisfying. But I don’t think it’s possible to endure the knowledge of the crises we face, unless you are able to draw on this other kind of knowledge and practice, whether you find it in art or religion or any other domain in which people have taken the liminal seriously, generation after generation. Because the role of ritual is not just to get you into the liminal, but to give you a chance of finding your way back.

    Among the messages of the liminal is that endings are also beginnings, that sometimes we need to ‘give up’, that despair is not a thing to be avoided at all costs – nor a thing to be mistaken for an end state. 

    * * *

    Somewhere in the tumbling days that followed the US election, I saw it go by in the stream of social media. ‘It’s basically Breitbart vs Dark Mountain now, isn’t it?’ someone wrote, like we’re the last ones left whose worldviews aren’t in smithereens after the year that just happened. And like a few things in 2016, it had the taste of a bad joke that might have more truth in it than you’d want to be the case.

    In the last weeks of the year, as we were putting together this series of reflections, a discussion got started among the Dark Mountain editors about what the role of this project should be, in the years ahead. Bad jokes aside, it’s clear that the work we’ve been doing has taken on a new relevance, and with that comes a sense of responsibility.

    A couple of things are clear. The books we publish will always be at the heart of this project – and the work of artists, the makers of culture, will always be our starting point.

    Every year, thousands of copies of our books go out to readers around the world. By the standards of an independent literary journal, it’s an achievement, and it’s through the sale of our books that we’re able to pay for some of the work that goes into Dark Mountain. (The rest of the work, as you can imagine, is a labour of love.) 

    A sobering realisation this autumn, though, was that the audience coming to this website each year is a hundred times the size of the number of people ordering the books. There’s nothing wrong with that, of course – but over the years, we’ve given only a fraction of the attention to this site that goes into each of our print issues.

    So we came to the conclusion that it’s time to do something online that comes closer to the richness of the books we publish (and will go on publishing). Exactly what form this takes, we’re still working on – but it’s going to be an online publication, something more than, and different from, a blog – and a site that reflects more of the web of activity of the writers, thinkers, artists, musicians, makers and doers who have taken up the challenges of the Dark Mountain manifesto.

    To make this happen, we need your help. 

    We’re asking for donations to cover the costs of building and launching a new online home for Dark Mountain. You can send a one-off amount, or set up a small monthly subscription – or if you’d like to talk about other forms of support, then you can get in touch. Everything you need to know is here, on our new fundraising campaign page.

    How ambitious we can be with the next phase of Dark Mountain depends on the level of support we get, so at this stage we’re not setting a fundraising target or a deadline – but we’ll tell you more as we go along. 

    Meanwhile, thank you for reading and sharing the work we publish. From the crowdfunding of the manifesto onwards, everything Dark Mountain has done over the years has been made possible by the support of friends, collaborators and readers. We don’t take that for granted – and wherever things go next, however dark it gets, we’re thankful for the journey we’ve been on with you.


    Published on the Dark Mountain website as the closing essay in a series reflecting on the political events of 2016 — and to launch the campaign that crowdfunded the new online edition of Dark Mountain. Over the following six months, we succeeded in raising over £37,000 to fund the creation of a new online edition which launched in June 2018.

  • How to Deal With ‘The Nazi Philosopher Martin Heidegger’ When Writing for a General Audience

    I’m no philosopher, but I sometimes drink wine with philosophers, and by the time you get onto the third or fourth bottle, the conversation often comes around to the uncomfortable case of Martin Heidegger.

    For my fellow non-philosophers, I think I can sum this up by saying: there’s this guy who is widely (not universally) considered to be one of the greatest philosophers of the 20th century, but unfortunately he was also a Nazi — a Nazi who lived until 1976, but never got round to apologising for his enthusiasm for Hitler.

    (You can just imagine how much ink has been spilt over this, but as a starting point, here’s the relevant section of his Wikipedia entry — while this from Joshua Rothman at the New Yorker gives you a flavour of the angst that philosophers go through.)

    I find Heidegger’s style almost as unbearable as his politics, and probably for that reason he has had little influence on my thinking. But I’ve worked with people like David Abram and Tom Smith who have no sympathies with the politics, but find intellectual nourishment in other parts of his thinking. So I’m willing to accept that there may be things there worth drawing on.

    (For what it’s worth, I suspect I found my equivalent nourishment in the work of Ivan Illich, who also offers deep critiques of technology and modernity, and for whom the concept of ‘home’ was also important — but who gets to this via pre-modern traditions of philosophy and theology, rather than leaning on Heidegger. That would be Illich who, aged thirteen, was called out in front of the classroom in Vienna and made to stand in profile, as the teacher pointed to his nose and told his classmates, ‘This is how you spot a Jew.’ Just saying.)

    Anyhow, as an editor at Dark Mountain — where technology, modernity and the concept of ‘home’ are among the themes taken up by our contributors — I’ve struggled periodically with texts that are written for a general audience and draw on Heidegger without acknowledging his politics.

    Basically, here’s how I see it: if you introduce Heidegger to a general reader with enthusiasm and don’t mention his unapologetic Nazism, sooner or later that reader will find out and feel betrayed. At which point, they will question your judgement — and possibly your political motives.

    What got me writing about this today is a new essay from Charles Leadbeater at Aeon which is a great example of how to do this right. The whole essay is worth reading, but here’s the bit that’s relevant:

    The philosopher who understood this search best is controversial: Martin Heidegger. A member of the Nazi Party, Heidegger never expressed remorse for the Holocaust and was often an arrogant, duplicitous bully. Some critics argue that his philosophy is too contaminated by racism to admit rescue. His ideas are often dismissed as parochial, nostalgic and regressive. Even his advocates acknowledge that his prose is deliberately dense.

    Yet, as the Australian scholar Jeff Malpas has shown in several thoughtful books and essays, studying Heidegger helps to explain why we are now so preoccupied by feelings of displacement that are triggering a search for home. Given Heidegger’s Nazi leanings and the rise of the populist Right in many parts of the developed world, his work could repay study.

    From here, Leadbeater is able to go further into what he — and Malpas — get out of Heidegger’s thinking, but the reader has not been set up for a horrible discovery at a later date. The thing that everyone needs to know when they engage with Heidegger has been stated clearly upfront.

    I realise that, if Heidegger’s work matters to you, you’re probably sick of having to make the argument that his politics doesn’t render the rest of it off-limits.

    When you’re writing in an academic context, it’s fine to assume that everyone knows the background — though please don’t make this mistake when teaching. (As Chenoe Hart pointed out in a discussion about this on Twitter this morning, you can go through architecture school hearing loads of discussion about Heidegger and never learn about the Nazism bit.)

    And OK, I’m not actually saying you should always refer to him as ‘the Nazi philosopher Martin Heidegger’, but given that it’s not unusual — for example, in this great piece by Neil Fitzgerald — to read about ‘the Marxist philosopher Slavoj Žižek’, you might consider doing it now and then.


    First published on Medium.

  • When the Maps Run Out

    I have been thinking about the slipperiness of history, how it escapes our grasp. When we study a war in school, the first facts we learn are the last to be known to anyone who lived through it: when it was over and which side won. Those who do not remember the past may be condemned to repeat it, but hindsight is very nearly the opposite of memory. To remember is to be returned to a reality that was not yet inevitable, to recall the events which shaped our lives when they might still have gone otherwise.

    ‘The Dark Shapes Ahead’ (2012)

    The world is in flames and if you think it’s all the fault of those people — the uneducated, the bigoted — I urge you to think harder.

    When the values of social liberalism got hitched to the mercilessness of neoliberalism, it kindled a resentment towards the former among the latter’s losers. The deal was summed up in Alan Wolfe’s formulation: ‘The right won the economic war, the left won the cultural war and the centre won the political war.’ He said that in 1999. It was under Bill Clinton’s presidency that the ‘centrist’ settlement between progressive cultural values and There Is No Alternative economics was consummated. Two decades on, that made Hillary Clinton the dream opponent for a candidate running on the fuel of resentment.

    Here’s a stony truth to stomach: today, across the western countries, the culture war to defend the real social achievements of the past half century is grimly entangled with a class war against the losers of neoliberalism.

    If we now lose many of the unfinished achievements of the struggles against racism, sexism and homophobia, the Clinton generation of politicians will share the responsibility.


    I came home on Tuesday thinking Clinton was going to win, just like I came home in June thinking Britain was going to vote Remain.

    It turns out you can spend the best part of a decade talking “collapsonomics”, writing about the dark shapes ahead and the unravelling of the world as we have known it, and still let yourself get lulled into believing the status quo will hold a little longer.

    It helps that I voted Remain. I would have voted Hillary if they gave the rest of the world a vote.

    Still, the day after the referendum, when everyone was sharing that chart showing that Remain voters were better educated, it filled me with an anger that stopped me writing. Were so many of you really so blind to the link between education and privilege?

    Back before I was that Dark Mountain guy, I worked as a local radio reporter in a city in the north of England. In the newsroom one day I saw a set of figures that are fixed in my memory: among 19 year olds with a home address in the leafy suburban southwest of that city, 62% were in higher education; on the council estates and terraced streets to the northeast, where my sister lives, the number was 12%. A kid from the right side of town was five times more likely to get to university than a kid from the wrong side of town. That’s when I got it: for all the other things it does, the major social function of higher education today is to put a meritocratic rubberstamp on the perpetuation of privilege.

    All those posts pointing out that graduates voted Remain, they seemed to imply that the higher you climb the ladder of education, the further you can see, the better equipped you are to make important decisions. But there are truths that are seen more clearly from below. Which side of town would you imagine has a clearer picture of the link between education and privilege?


    On Twitter right now, pundits who seem unhumbled by all the ways they didn’t see this coming throw around snapshots of exit polls to prove that this was or wasn’t about misogyny, racism, or a working class revolt.

    Start with a different set of numbers.

    Last September, the economists Anne Case and Angus Deaton published a study that showed that the death rate for middle-aged white Americans had started rising back in 1999. For every other group in the population, the death rate continues to fall. Among middle-aged white Americans, it is those who left education earliest who are doing most of the dying. They are dying of suicides and overdoses, alcohol poisoning and liver disease. The number of deaths is on a par with the AIDS epidemic at its height, but the causes bring to mind another historical parallel, to Russia in the years after the fall of the Soviet Union. Yet this American fall has taken place uneventfully, almost unnoticed, even by the gatherers of statistics: by their own account, Case and Deaton stumbled on their findings by accident.

    In March, after Super Tuesday, the Washington Post plotted the death data against the primary results. In eight out of nine states, they found a correlation: the counties where death rates for middle-aged whites were the highest were the counties where the vote for Trump was the strongest.

    I don’t know how you can look at that and say that Trump’s election is only about racism and misogyny, that it is not also a consequence of something that has been going terribly wrong in the lives of those white Americans with the lowest cultural capital.


    All year I’ve been watching sensible respectable well-paid commentators flailing to catch up with the collapse bloggers: these fringe thinkers off the internet, narrators of America’s long decline, people I’ve been reading (and occasionally publishing) for a decade or so, were the one group whose models of reality could handle what was happening.

    John Michael Greer lives in the Rust Belt and writes The Archdruid Report, a blog that rolls out every Wednesday, a kind of midweek sermon on nature, culture and the future of industrial civilisation. He called the election for Trump back in January. He is an actual archdruid, as it happens — as well as an SF novelist, a freemason and a self-described ‘moderate Burkean conservative’.

    In a series of posts this year, he sketched out a take on the long backstory to this election which goes something like this:

    Politics is about how a society deals with the collision between the interests of different groups. The great contribution of the liberal tradition was to show that politics can also be about values — but the corruption of that comes at the point when values are used as a cover for interests.

    The policies of globalisation, the deindustrialisation of the US economy and its increased reliance on illegal immigration as a source of cheap labour, were the result of political choices. These choices served the interests of those Americans with salaries and a higher education, while going against the interests of wage-workers. But instead of this collision of interests being negotiated within the political sphere, the results of these policies were presented as inevitable and universally desirable. In particular, any attempt to talk about whose interests were served by the role of illegal immigration was immediately derailed into an argument about values where anyone questioning immigration was accused of racism.

    Trump’s campaign played on this in two ways. First, by deliberately outraging the socially liberal values which had become so entangled with the interests of the salariat, he could build a rapport with other parts of the electorate. Then, by focusing on immigration, jobs and protectionism, he gave those voters a sense that their interests might actually have found a political vehicle.

    The danger of this kind of analysis is that it downplays the uglier forces on which Candidate Trump fed for his success, the forces which President Trump will embody. But it gives you a sense of how the election can have looked in the Rust Belt towns, to the low income white Obama voters who swung to Trump, in the places where all that dying is going on.

    More than anyone else I’ve read this week, Greer seems persuaded that the dangers of a Trump presidency have been overstated. It’s possible to hope that he is right, I suppose — and, meanwhile, to assume that he is wrong and prepare accordingly.


    The blogger who goes by Anne Tagonist (or sometimes Anne Amnesia) is less sanguine. ‘What Trump’s boys have for me is a noose,’ she wrote, back in May, ‘but that’s the choice I’m facing, a lifetime of gruelling poverty, or apocalypse.’

    Yeah I know, not fun and games — the shouts, the smashing glass, the headlights on the lawn, but what am I supposed to do, raise my kid to stay one step ahead of the inspectors and don’t, for the love of god, don’t ever miss a payment on your speeding ticket? A noose is something I know how to fight. A hole in the frame of my car is not. A lifetime of feeling that sense, that “ohhhh, shiiiiiit…” of recognition that another year will go by without any major change in the way of things, little misfortunes upon misfortunes… a lifetime of paying a grand a month to the same financial industry busily padding the 401k plans of cyclists in spandex, who declare a new era of prosperity in America? Who can find clarity, a sense of self, any kind of redemption in that world?

    When I interviewed Anne for the last Dark Mountain book, I learned a little more about her background in zine writing and travelling and roads protests, working as a street medic, then on ambulances, and from there to medical school and research. She doesn’t write so often, but when she does, what I appreciate is her willingness to puzzle through a question, to include her uncertainties, rather than making a neatly rounded argument.

    And that post in May was scorching. It starts with the Case-Deaton death rate study, but seen through the eyes of someone living in one of those counties, someone who has been sitting in with the Medical Examiner:

    A typical day would include three overdoses, one infant suffocated by an intoxicated parent sleeping on top of them, one suicide, and one other autopsy that could be anything from a tree-felling accident to a car wreck (this distribution reflects that not all bodies are autopsied, obviously.) You start to long for the car wrecks…

    Unlike the AIDS crisis, there’s no sense of oppressive doom over everyone. There is no overdose-death art. There are no musicals. There’s no community, rising up in anger, demanding someone bear witness to their grief. There’s no sympathy at all. The term of art in my part of the world is “dirtybutts.” Who cares? Let the dirtybutts die.

    You know, I could just repost every other paragraph of that piece here, but really you should go read the whole thing.

    From where I live, the world has drifted away. We aren’t precarious, we’re unnecessary. The money has gone to the top. The wages have gone to the top. The recovery has gone to the top. And what’s worst of all, everybody who matters seems basically pretty okay with that.


    Is this OK, I wonder, just bombarding you with a reader’s digest of the apocalypse?

    It’s not the apocalypse, of course, it’s just history, but if you thought the shape of history was meant to be an upward curve of progress, then this feels like the apocalypse.

    Midway through the night, when the New York Times projection had slipped from Likely to Leaning to Tossup, as I broke open the whisky and let rip on Twitter, my friend Chris T-T replied, ‘I love that your reaction to fear is a splurge of analysis.’

    There’s a rawness in the aftermath of nights like that, a sense that the callused outer skins of our grown-up selves have been ripped off. For a day or two, maybe longer, we can feel things with the intensity of children again. (Or as someone in my timeline wrote, ‘The OH FUCK! comes in waves.’)

    It reminds me of the conversations that sometimes happen in the last days of a life, or on the evening of a funeral. In the underworld of loss, we don’t get to bring our achieved identities with us, so there’s a chance of getting real.


    The morning after last year’s unexpected Conservative election victory in the UK, I wrote some notes on how to make sense of the loss. As political bereavements go, it looks quaint now by comparison — don’t you feel nostalgic for when the worst thing that could happen was waking up to find David Cameron was still prime minister? But one thing from that post sticks out, the part where I was building on a line from the mythographer and storyteller Martin Shaw: ‘This isn’t a hero time, this isn’t a goddess time: it’s a trickster time.’

    When people like John Berger (one of my heroes) were young, it was a real thing to believe in the heroic revolution that Marx had seemed to promise. Today, the only kind of revolution that is plausible is a foolish one, one where we accidentally stumble into another way of being human together, making a living and making life work. (And whatever that might look like, it doesn’t look like utopia.)

    I wrote that thinking of the weird cameo role that Russell Brand had been playing in British politics: not thinking of him as a candidate to lead a trickster revolution, only as a clue to the motley in which change would need to come in a time like this.

    I’m pretty certain it was Ran Prieur, another of the collapse bloggers, who put me onto the idea of Trump as trickster, but the best treatment of that thought I’ve found is Corey Pein writing for the Baffler.

    He starts with the story of Allen Dulles, later the director of the CIA, who recruited Carl Jung as an agent during World War II to provide insights on the psyche of Hitler and the German public. ‘Nobody will probably ever know how much Prof. Jung contributed to the Allied cause during the war,’ Dulles wrote afterwards. We do know that, in an essay in 1936, Jung had written, ‘the unfathomable depths of Wotan’s character explain more of National Socialism than all [proposed] reasonable factors put together.’ (As Pein goes to some lengths to acknowledge, such thoughts are quite a stretch for the early 21st century western imagination: if you’re struggling, try telling yourself, ‘Obviously Jungian archetypes are just metaphors,’ and then remove the ‘just’ from that statement.) If Wotan could be awoken in the collective psyche of a nation, Jung added, then ‘other veiled gods may be sleeping elsewhere.’ Which is how Pein comes to Trump:

    Just as Hitler was not known to crack wise from the podium, Trump’s stump speeches do not call to mind ‘storm and frenzy.’ Trump is no Wotan, no berserker — he is a wisecracker, adept in the cool medium of television. He represents an entirely different Jungian archetype — namely, the pan-cultural mythological figure of ‘the trickster,’ who arrives at moments of uncertainty to bring change, often of the bad kind.

    Pein is being a little unfair on the trickster here, I think. Lewis Hyde gives a subtler account in his marvellous book, Trickster Makes This World. He identifies trickster as a low status character within the local pantheon of a culture, a mischievous messenger boy, a nuisance under normal circumstances, but who takes on an altogether more important role in moments of deep cultural crisis: when those who hold high status within the existing order of things are helpless, trickster can shift the axis, find the hidden joke that allows the culture to pass through into a new version of itself.

    If you’ll grant that such uncivilised ways of thinking could help us make sense of political events, I’ll tell you that Donald Trump is a shadowy parody of a trickster. That takes me to something the poet Nina Pick says in a conversation in the latest Dark Mountain:

    We’ve lost the power of metaphor. You can see it in American politics at the moment for example; there’s a deficit of imagination, of the imaginal life, of myth… and without that level of myth and of metaphor I think we start to get lost as a culture.

    When we lose sight of myth and metaphor, we don’t leave it behind, we just become unaware of the ways in which it is still at work in our culture.

    Or, as Martin Shaw, who set me thinking about all this, would put it:

    The stories that we are being fed now are not myths. They are what I would call, toxic mimics. But when we are deprived of the real thing, we will take even an echo and grab on to it. So in other words, the most horrible lies always have a little bit of truth in them.

    So there you have it, that’s my hot take: Donald Trump is a toxic mimic of Loki.


    At this point, there are a couple more things we need to talk about, before I try and leave you with some blessing for the dark times that are gathering around us.

    There’s something more to say about the work that lies ahead, if it’s seriously the case that we are in territory where archdruids and zine writers and collapse bloggers and mythtellers are the ones who still have maps that seem to make sense.

    But first, we need to talk about Hitler.


    If there is any meaning left in a word like fascism, then let’s call Trump a fascist.

    Heck, even John Michael Greer’s first take on the Donald’s campaign, back in the summer of 2015, was that it ‘is shaping up to be the loudest invocation of pure uninhibited führerprinzip since, oh, 1933 or so.’

    But it’s worth lingering over that ‘if’… Words like ‘fascist’ are mostly used these days as a stop to thinking, a shorthand that saves us the work of knowing our enemy.

    Anthony Barnett walked this line in an essay for Open Democracy, the night before the election:

    It is essential to be able to distinguish between different kinds of evil and judge them accordingly… As a rule, therefore, never talk about ‘fascism’ or ‘Stalinism’ in political or polemical writing… They are used to mobilise an attitude that pre-empts scrutiny. And even interest. If something is fascist we should be able to ask what kind it is and how bad it might be, but the concentration camps make such an approach taboo.

    ‘For the first time,’ he goes on, ‘I break the rule.’

    And if the hesitation adds force to his doing so, it also leaves room for a qualification. Trump is a fascist, Barnett writes, but unlike Hitler, he does not have financiers, storm-troopers or an organised movement. What he now has is the office of President of the United States and a seemingly compliant legislature.

    Another line of caution about the Hitler comparison comes from another of Anne Tagonist’s essays — written just after Super Tuesday, when she was already taking the likelihood of a Trump presidency seriously — in a genre she calls ‘clumsy writings about why history doesn’t work the way you think it does.’

    The systematic study of mass behaviour, she points out, is largely a post-World War II phenomenon.

    In 1945, Germany was in ruins, the world had entered the atomic age and the cold war, Americans were starting to realize exactly how many civilians had been exterminated in “labour” camps, and yet no consensus narrative had emerged about how such an unthinkable sequence of events could have happened… The Third Reich was a very good reason to go out and learn more about how humans behaved in groups.

    And so, with the contributions of Adorno, Arendt, Milgram and others, within twenty years, an intellectual consensus emerged about how Nazism had come about, how it had achieved such adoration and power, and how it enlisted so many Germans in the systematic perpetration of horrors.

    We had a system custom-built to explain the Nazis, that explained the Nazis. A side effect is that now, every large-scale bad social movement looks a bit like the Nazis.

    Remember, she is not making this argument to tell us there’s no need to worry, this is Anne who also wrote that ‘What Trump’s boys have for me is a noose.’ What she is getting at is the danger of readying ourselves to fight the last war. Literally.

    Actual historians don’t tend to think history repeats itself, or if they do, they find celebrated yet incomplete examples that don’t assume the world began a century ago and only one bad thing ever happened in it.

    But OK, let’s say that this is our January 1933.

    We don’t know the shape of the war that could be coming, nor how that war will end, and not only because we cannot see the future, but because it hasn’t happened yet: there is still more than one way all this could play out, though the possibilities likely range from bad to worse.

    Among the things that might be worth doing is to read some books from Germany in the 1920s and 30s, to get a better understanding of what Nazism looked like, before anyone could say for sure how the story would end.

    Another thought, from that post I quoted at the start, written four years ago, on a journey I made in search of cultural resilience:

    If someone were to ask me what kind of cause is sufficient to live for in dark times, the best answer I could give would be: to take responsibility for the survival of something that matters deeply. Whatever that is, your best action might then be to get it out of harm’s way, or to put yourself in harm’s way on its behalf, or anything else your sense of responsibility tells you.

    Some of those actions will be loud and public, others quiet, invisible, never to be known. They are beginning already. And though it is not the bravest form of action, and often takes place far from the frontline, I believe the work of sense-making is among the actions that are called for.


    I notice that there is a part of me that would like not to be serious, that would like it to be secretly a bluff, a puffing of the ego, when I say that it feels like there’s a new responsibility landing on the ragtag of thinkers and tinkers and storytellers at the edges, of which I have been part over these last years. And for sure, this is only one map I’ve been sketching; others will have their own that may or may not overlap.

    But the way it looks from here tonight, the people who are meant to know how the world works are out of map, shown to be lost in a way that has not been seen in my lifetime, not in countries like these.

    I am thinking of one of the smartest, most thoughtful commentators on the events of this year, whose analyses have helped many of us make sense of what Brexit might mean, the director of the Political Economy Research Centre at Goldsmiths, University of London, Will Davies. In an article for the Washington Post, a week after the referendum, he drew the parallel to the Republican primaries. He too had picked up on the Case-Deaton white death study and the correlation between mortality rates and Trump support.

    ‘Could it be that, as with the British movement to leave the EU, Trump is channelling a more primal form of despair?’ he asks. But as the article approaches a conclusion, the despair seems to have spread to its author. ‘When a sizeable group of voters has given up on the future altogether… how does a reasonable politician present themselves?’

    All of this represents an almost impossible challenge for campaign managers, pollsters and political scientists. The need for candidates to seem ‘natural’ and ‘normal’ is as old as television. Now it seems that they also need to give voice to the private despair of voters for whom collective progress appears a thing of the past. Where no politician is deemed ‘trustworthy,’ many voters are drawn toward the politician who makes no credible pledges in the first place. Of course government policy can continue to help people, and even to restore some sense of collective progress. But for large swaths of British and American society, it seems best not to state as much.

    As I read them, these are the words of a person who is running out of map, though one who gets closer than many to seeing how deeply the future is broken, how far the sense of collective progress is gone.

    While the victorious political centre of the Clinton and Blair era has gone on insisting that everything is getting better and better, some of the smartest thinkers on the left have recognised the breakdown of the future and responded by setting out to reboot it, to recover the kind of faith in collective progress that made possible the achievements of the better moments of the twentieth century.

    If their attempts have struggled to gain traction, one reason may be that the left is better at recognising the economic aspects of what has gone wrong than the cultural aspects, which it tends to ignore or bracket under bigotry. There are great forces of bigotry at work in the world, they will have taken great encouragement from Trump’s election and they need to be fought, as they have been by anti-fascist organising in working class communities, again and again. As a teenager in the northeast of England, my first activism was going out on the streets against the British National Party with Youth Against Racism in Europe. Still, without pretending that they can be neatly disentangled, there are other aspects of what has gone wrong that belong under the heading of culture, besides racism and xenophobia.

    In the places where it happens, economic crisis feeds a crisis of meaning, the two spiralling down into one another, and if we can only see the parts that can be measured, we will miss the depth of what is happening until it shows up as suicide and overdose figures.

    Without a grip on this, the left has struggled to give voice to those for whom talk of progress today sounds like a bad joke. And yeah, maybe Bernie could have done it — he’d surely have been a wiser choice for the Democrats this year — but the thing is, we’ll never know. Meanwhile, across the western countries, too often, the only voices that sound like they get the anger, disillusionment and despair belong to those who seek to harness such feelings to a politics of hatred.

    This is where I intend to put a good part of my energy in the next while, to the question of what it means if the future is not coming back. How do we disentangle our thinking and our hopes from the cultural logic of progress? For that logic does not have enough room for loss, nor for the kind of deep rethinking that is called for when a culture is in crisis. But that is another story, and a longer one even than this text has become, and I must get up in a few hours’ time to go talk about that story with a conference full of hackers.


    On Wednesday morning, the snow was falling hard. Before I finally got to bed, I had given my son breakfast and taken him to kindergarten, pushing his buggy through the snowstorm. Last time we had snow, he was still a baby wrapped inside a pram: now he is fifteen months and discovering everything. After dinner that night, he danced with me and we laughed together like fools.

    I want to say that this is also history, though it doesn’t get written down so much: the small joys and gentlenesses, the fragments of peace, time spent caring for our children, or our parents, or our neighbours. These tasks alone are not enough to hold off the darkness, but they are one of the places where we start, one of the models for what it means to take responsibility for the survival of things that matter deeply.

    Fifteen months and every day now he is playing with new words in his mouth. I can see the time coming when the words become sentences and questions, when he starts to want the world explained to him.

    ‘How can I get through it?’ a friend asked.

    This was earlier that morning, before the snowstorm.

    ‘We’ll get through because we have to,’ I wrote, ‘the way we always have, one foot in front of another. Hold those you love tight. Be kind to strangers.’

    ‘I’m really not looking forward to telling my kid he lives in President Trump’s country,’ another friend wrote.

    ‘Our kids are going to be the ones who get us through this,’ I told her. ‘That’s how long this journey will take.’

    What am I doing here, I wonder now? I don’t even live in America. Though somehow we all live in America, because it fills our ears, spills out of screens and teaches us to dream. But also because we can feel it coming, see the same gaps widening in our own societies, watch the same complacency or helplessness on the faces of the old leaders and the ugly smiles of those who are sure their time is coming.

    Everyone who said they knew what they were doing has failed. How badly things turn out now, we can’t say for sure. But there is work to be done.


    First published as Issue 11 of Crossed Lines, my occasional email newsletter.

  • Spelling it Out

    Spelling it Out

    On the desk at which I write there lies a wand. At least, this is how I have thought of it, since the afternoon, five or six years ago, when it came into my hands: thirteen inches of fenland bog oak, turned on a pole lathe, its tip the shape of an acorn.

    I’d slept the night at a friend’s house in Peterborough and, before dropping me at the station, he wanted me to see the Green Backyard. Even in the short time I had to walk around the site and chat over a cup of tea, I got why. There’s a particular magic that encircles certain projects, so strong that you can smell it. I think of the Access Space media lab in Sheffield, or the West Norwood Feast street market in south London, owned and run by the local community.

    By invoking the idea of ‘magic’, I want to point to a quality which these projects share. At their heart is something that is obvious, yet beyond the grasp of the logic of either the private or the public sector, because their existence would be impossible without the active involvement of people who are doing things freely, for their own reasons, rather than because they have been paid or told to do so. A parallel vocabulary has grown up to cover this kind of activity — its initiates speak of ‘the third sector’, ‘civil society’, ‘social capital’ and so on — but my suggestion is that, while it may have its uses, such language misses much of what people experience as distinctive about places such as Access Space or the Green Backyard. (Nor is it quite covered by the older language of ‘volunteering’.)

    I could go further in elaborating this distinctiveness and the way it eludes expression in a formal language — and I would do so by locating this kind of activity within the logic of the commons, as distinguished from the entwined logic of public and private. As Ivan Illich writes of the customary agreements which governed the historical commons of England, ‘It was unwritten law not only because people did not care to write it down, but because what it protected was a reality much too complex to fit into paragraphs.’ This complexity did not present a problem for those involved in commoning — and, as Elinor Ostrom demonstrates conclusively, Garrett Hardin’s much-cited assertion that commoning ends inexorably in tragedy was a crude libel. Rather, it is to those who would govern, manage or exploit from above that the ‘illegibility’ of the commons appears as a problem. In any attempt to simplify the complex human fabric of a commons into a written framework, what Anthony McCann calls ‘the heart of the commons’ is likely to go missing.

    This line of argument may go some way to explain the difficulties that ensue when those responsible for such projects find themselves having to deal with systems and institutions whose reality consists of that which can be written down, measured, counted and priced. Yet, in spelling this out, there is a danger that it comes to read as an argument against any attempt at collaboration with the public or private actors with which such projects often find themselves having to coexist, and this too would be a simplification. Instead, in the notes that follow, I want to share a way of thinking about the trickiness of language that has grown out of my own experience of helping to bring such projects to life.


    So I take the wand, or whatever it is, and draw a shape in the dust. This is not an authoritative model, only the kind of map that one friend might draw for another on the back of a napkin, trying to pin down an experience that is just starting to make sense.


    I have been carrying this model around for a couple of years. It came out of conversations with a friend with whom Anna Björkman and I were beginning a collaboration here in Sweden, and out of Anna’s experiences working with grassroots women’s organisations in Israel and Palestine. We needed a way to make sense of the shifting terms in which we found ourselves talking about the same project. It gave us a shared reference point to make sense of which language was appropriate to which context, how and when to move between them.

    It also offers a way of mapping a set of problems that you may have encountered in your own work or in the work of people and organisations with whom you have had dealings.

    For example, you might recognise the kind of project which has an Upward language but no Inward language, which appears to have been constructed entirely for the purposes of accessing funding and resources, with no underlying life to it. Whole organisations seem to exist to create such projects, serving little other purpose.


    Another situation is the project which has an Inward language but no Outward language. Most likely, this means that the project is not yet realised.

    The poet W.B. Yeats — no stranger to magic — once wrote, ‘In dreams begins responsibility’, and this can serve as a motto for the process by which an idea comes to life. At the start, there is a spark: a moment when you see each other’s eyes light up and the conversation quickens, or you catch sight of an opening and turn towards it. A long and indirect journey lies between this and the time when the idea has become something ‘out there’, something you can point to, something people can tell each other about — by which time, the fluidity of dreams has given way to the heaviness of responsibilities, paying bills and filing accounts.

    Often, you are some way on in this journey before the project has anything resembling an Outward language, and the words you use to explain it to outsiders may change many times before they settle into shape. The lack of a satisfying Outward language is not a problem to a project that is still making its way into being, though it may cause problems for those involved, if they are asked to explain why they are devoting their time and energy to it.

    However, in the absence of an Outward language, be cautious about attempting to explain a project that exists mostly in your dreams and schemes to a neutral audience. The Inward language is like a set of in-jokes: to those involved, it is a web of meaningful connections, but to the uninitiated it is just boring. In the worst case, this hardens into the phenomenon of those ancient mariners who haunt certain kinds of conference, keen to talk you through a PowerPoint deck the length of a Victorian novel which explains their model of the world and how it could be bettered. I don’t doubt that at the root of each such model lies a powerful experience of insight, but I would rather eat your cake before I decide whether I am interested in the recipe, and if you keep trying to feed me recipe after recipe, I may begin to wonder if you actually know your way around an oven.

    To get far enough inside another person’s model of the world that you can feel for yourself what it makes possible is a considerable undertaking. Around the projects with which I have been closely involved lies an improvised scaffolding of ideas — chunks of Keith Johnstone’s improvisation theory, Brian Eno’s notion of ‘scenius’, a back of an envelope version of John McKnight’s Asset-Based Community Development, swathes of the work of Ivan Illich, odd lines scavenged from poets, conversations that Anna and I have around the breakfast table — and in any particular project, these will be bound up with the thoughts and experiences of others with whom I am working. If you really want to know about this stuff, as we get to know each other, I’ll map out corners of it with you, rather as I am trying to map out one particular corner in this text. But the projects themselves must stand or fall without the scaffolding, or nothing has been built.


    One last case, before we sweep away the dust and the triangle with it.

    From time to time, I come across a project which has made the journey to the everyday world of responsibilities without losing sight of the dreams in which it began, which has a lively Outward language and shows signs of an Inward language — not densely scaffolded with footnotes, necessarily, but rich in meaning — and which has reached a point where increased contact with larger institutions and structures is necessary, often because its success makes it no longer possible to operate below the radar.

    If such contact is not to end badly, an Upward language is required, and guides are found to help navigate these colder and unfamiliar waters. These guides offer a formal terminology in which to describe the activities of the project, words which carry authority and which offer a legibility that may also contribute to the development of the Inward language, especially if this has tended to rely on the implicit, on things that are understood without even being put into words.

    The caution here is twofold. First, the authority of such words should not be treated with too much respect. The knowledge and understanding which those involved in the project already have is what brought the project to life — and while there are expert languages which are good at naming and describing the processes by which things come alive, these languages tend to be sterile in themselves. Make use of them, where they help, but do not treat them as seriously as they seem to want to be treated.

    Secondly, guard against the intrusion of the Upward language into the Outward. If it helps with funding applications to deploy words like ‘sustainability’, ‘innovation’, ‘learning platform’, ‘resilience’, ‘impact’ or whatever this year’s keywords are for the structures with which you need to interface, then by all means use them. Just don’t use them when you speak with or write for other human beings.

    It is here that Jessie Brennan’s work with the Green Backyard can offer an example. Art has its own tangle of languages, of course, but here the artist takes on the role of the listener, making time to go beyond the first answers that people might give to a survey or a journalistic vox pop, getting closer to the heart of why a project matters to the people who come into contact with it, then drawing out the words that sing to her and giving them voice in new forms. Not every project has the benefit of such a resident, but every project that has come alive has stories and voices like this, and will reward the patience of someone who takes on the role of the listener. This is where you find an Outward language, by listening to the way that people tell each other about what you are doing, looking for the words that seem to travel.


    What I remember from that brief visit to the Green Backyard is the web of lives and skills woven together into the project: the farmer who was persuaded to bring his tractor down to plough up part of the site; the offenders coming to work here as part of a community service order, some of whom went on coming back after their sentence was over; the graffiti kids painting boards around the site. The work of weaving together such unexpected combinations into a human fabric is a kind of gentle magic — and it is at its most powerful when grounded in place, as at that patch of former allotments in Peterborough, or the shipyard in Govan that is home to the Galgael Trust, or the acre of ancient ground in the Cheshire countryside where Griselda Garner and others weave together the Blackden Trust.

    Such projects do not play on a level field, but on fields that were enclosed generations ago and that are still being enclosed today by those who, like Garrett Hardin, want to insist that only privatisation can secure their future and that the public good is served by the maximisation of the kinds of value that can be reduced to a figure in a spreadsheet. Heartbreaking decisions often get made as a result, and even what looks like success can bring a danger of hollowing out. The land enclosures that climaxed in the 18th century were carried out in the name of ‘improvement’; today, the word would be ‘development’, but the dynamics are much the same. Yet if the value of the commons remains always partly mysterious to systems which can only deal with the legible, so too does their capacity for endurance and the strength which they give to those who live and work with them, and the process of enclosure is never quite as total as its promoters would like us to believe.


    First published in Re:Development: Voices, Cyanotypes & Writings from the Green Backyard by Jessie Brennan (Silent Grid, 2016).

  • Pockets: A Story for Alan Garner

    Pockets: A Story for Alan Garner

    There was a jigsaw we had when I was five, a map of Britain with illustrations of the places that matter. Two of these lodged in my imagination: the limestone wonder of the Cheddar Gorge, and the great dish of the radio telescope at Jodrell Bank. ‘We know the people who live next to Jodrell Bank,’ my mum told me, and this seemed a magical proposition. It was.

    By the time I started piecing together the jigsaw, our families were just about in Christmas card contact, but for a while in the early seventies, my mum had been a regular guest at Toad Hall. Her friendship with the Garners began on a children’s ward in Manchester, where she was nursing one of Alan’s daughters. Later, their hospitality became a place to turn in a dark moment of her life. The pieces of that story have come out slowly over the years, but from the way she spoke, I had the sense that these people and this place had shown her a great kindness. And when I finally found my own way up the bumpy track to Blackden, by which time I must have been about the age she was when she found refuge there, I knew that I was arriving at a place of sanctuary.


    Before that, there were the books. The Weirdstone, read for the first time on a rainy holiday in Swaledale, then racing on to the end of The Moon of Gomrath where the afternote was a first clue to the thoroughness behind the momentum of the telling. (‘The spells are genuine,’ Alan noted, ‘though incomplete: just in case.’) Like so many others, I was hooked, waiting for the arrival of the later books, returning to their pages and always finding more. There is something here that feeds a hunger in us, a hunger that is hard to name in the words our culture has to offer.

    There are no favourites, but one book stands out because I find it hard to know who I would be if it hadn’t turned up when it did. The Voice That Thunders was published the summer I was about to go up to Oxford and I carried it like a secret through the next three years. Under the bombardments of the graduate recruitment brigade, I would find shelter in Joseph Garner’s quietly brutal careers advice: ‘Always take as long as the job tells you’ and ‘If the other feller can do it, let him!’ (There was another spell, only this time with no safety catch, no words left out.) The effect of reading that book that summer was to awaken a sense of loss that was also a coming alive. As if a grief that had been a background greyness, taken for reality itself, was lifted into focus, could now be felt, honoured, lived through.

    For a bewildered young man from the north of England, entering the unforgiving world of Oxford, this was a kind of armour. It didn’t matter that I was unable to explain to my tutors or my peers why Alan’s work mattered so much. My explanations would have been too personal, unintelligible within the language we were being taught to use. When I suggested to Craig Raine that I write on Garner for the 20th century paper in Mods, he said it was a touching thought, but I should really focus on authors of the first rank, which revealed his ignorance and saved us both a deal of pain. (Though another tutor, the great Shakespearean A.D. Nuttall, gleamed when I mentioned Alan’s name.)

    A first-rate academic education often resembles a half-complete shamanic initiation. The initiate’s body of beliefs is cut to pieces, the head severed from the heart. She is taught to analyse or deconstruct anyone’s way of making sense of the world, including her own. Yet the institution overseeing this operation scarcely recognises the reconstruction that must follow, if the young person passing through its care is to emerge whole.

    In the depths of that initiation, little of what had come with me to Oxford still made sense, but these books did. They offered a refuge of meaning that I knew was not escapism. That their author had proven himself in the tutorial room and then chosen to walk away from this world was part of their power. What followed, in the journey from The Weirdstone to The Stone Book, was evidence that the severing need not be final, that head and heart could be brought back together, within our culture, even if the cost of this was indeed ‘total war, by which I mean total life, on the divisive forces within the individual and within society.’


    Later, by the fireplace at Toad Hall, Alan told me about the meeting with his tutor when he had made the decision to leave Oxford and try to write. ‘Do it,’ the tutor said, ‘and if you find that you don’t have what it takes, then come back next year, and no one will think the less of you. But if you find that you do, then you will have to create a Magdalen of your own.’

    That was what he had done, I thought — he and Griselda — in the net of fellowship that gathers around their kitchen table and stretches to the corners of the world. I was drawn into the net after a talk that Alan gave at the Temenos Academy in London. I had asked a question that caught his attention, then stayed behind to pass on greetings from my mum to Griselda. When she recovered from bouncing with excitement and discovered that I was working on something called the School of Everything, Griselda decided that I must be enlisted to assist the Blackden Trust.

    So I found myself bumping up that track to the house in the middle of a field, the telescope looming like a great white Grail behind it. As we walked from room to room, Alan told the stories of the place and handed me objects that I knew without ever having seen: the stone book, the little whizzler, the Bunty. Just as awe was in danger of taking over, the thought struck that this shy, funny, brilliant man was also still the boy in the wartime photograph, that he was sharing these treasures just the way a small child will make friends by sharing his toys.

    On the visits that followed, I got to know the Trust in action. It is a school in the truest sense: a place that offers the leisure to slow down, to deepen your attention, to notice the unexpected and to draw out its implications with rigour. Young people learn to look hard, to ask a question and follow where it leads, to test ideas and always to pursue the anomaly. They do so in the company of experts of the highest standing who are unafraid to display the limits of their knowledge or to explore their disagreements with good humour.

    I have sat in its grounds as we knapped flint, under the guidance of a professor of archaeology, listening as the conversation gave way to silence, as the rhythm of our tapping fell into unison and the realisation spread among the group that this sound was being heard on this spot for the first time in ten thousand years. Another time, when Ronald Hutton led a seminar on the Civil War and one of our group was moved to tears, I understood that it was possible to carry out the work of the historian, with all academic diligence, and at the same time to perform an older and more universal task: to honour the dead in such a way as to give meaning to the living.

    Ivan Illich once described the climate which he had sought to foster in the meeting places he had helped to create, and it is a description that makes me think of Blackden: ‘Learned and leisurely hospitality is the only antidote to the stance of deadly cleverness that is acquired in the professional pursuit of objectively secured knowledge. I remain certain that the quest for truth cannot thrive outside the nourishment of mutual trust flowering into a commitment to friendship.’


    Around that table, you never know what field the conversation will enter next, and it was on one of those evenings that I first heard talk of ‘cryptic northern refugia’. Once upon a time, a species like the oak was thought to have survived the last Ice Age only at the southern edges of Europe, from where it marched out again across the continent in waves, over centuries, to reseed the warming landscape. Now we know how fast that warming came — seven degrees in a decade, at the end of the Younger Dryas — and the palaeoecologists keep finding traces of plants and animals in times and places where they should not have been. So the old model has given way to a new hypothesis: in certain places, pockets of leafy woodland endured, protected by their own microclimates, harbouring isolated communities of creatures which would otherwise only have survived far to the south. These northern refugia were cryptic, so small as to barely leave a trace in the record, but the sites identified lie in steep-sided valleys, where high and low ground meet. Places such as Cheddar Gorge, or Ludchurch.

    There is a path that leads from here to Boneland, but I want to turn back instead to The Voice That Thunders and a glint of that vein of creative anger that runs through Alan’s work: an anger, by his own description, ‘at once personal, social, political, philosophical and linguistic.’ Addressing an audience of headteachers, invited to speak on ‘The Development of the Spiritual’, he issues a warning against the rise of a materialism which can see the world only through the lens of accountancy, which turns all to commodity, which appropriates competence in all fields of human affairs, from the classroom to the publishing house, and which, if unresisted, will usher in ‘a spiritual Ice Age’.

    Twenty years on, the ice has spread further across the social landscape, and few institutions are untouched. ‘The new world economic order,’ as John Berger terms it, is a totalisation of the process of enclosure which the land man brought to Thursbitch. What is the shape of hope in such a landscape? ‘The shape of a pocket,’ Berger answers. ‘A small pocket of resistance.’ The image is borrowed from Subcomandante Marcos of the Zapatistas. Its smallness reflects the distance both men have travelled from the grand historical expectations of revolution, their Marxism tempered by the experience of the peasants of the Haute Savoie or the indigenous people of Chiapas. Perhaps because Berger writes in the same book about the cave art of the Palaeolithic, I hear a rhyme between the political and the prehistoric. If there is hope left, in this Ice Age, it is in the hidden pockets, the refugia too small to seem significant.

    ‘Resistance is growing,’ Alan tells the headteachers. ‘Especially amongst artists.’ The enclosure is never quite total; the hills will outlast the walls. That which is supposed to be lost often turns out only to be dormant, marginalised, walking the edges, or gone underground. In the darkest hour, that which is meant to be obsolete may yet make all the difference. The Trickster spirit will always get aback of those who only see the things that can be measured, counted and priced.

    And in the meantime, there are always the pockets, the hidden corners of conviviality, the cryptic northern refugia, the places that matter. If that long-inhabited patch of ground across the railway tracks from the telescope at Jodrell Bank is such a place, the same is true of the pages of the magical books that have been written there.


    Published in First Light: A Celebration of Alan Garner, edited by Erica Wagner (Unbound, 2016)

  • We Are the Only Species We Have the Option of Being: A Conversation With Anne Tagonist

    We Are the Only Species We Have the Option of Being: A Conversation With Anne Tagonist

    A couple of weeks before COP21, I did an interview with an American radio station. They set me up against another guest, a professor at Yale who specialises in the psychology of climate communication. I don’t know what my credentials were meant to be, on this occasion, except that the producer said, ‘I spend a lot of time interviewing people about climate change and the things you say are the things the others only say after I’ve switched off the mic.’

    The ISDN line from the broom cupboard in Stockholm where I was sitting to their studio in Chicago kept dropping out, so I only heard half of what the guy they wanted me to argue with was saying. What stuck with me was a question that came from the host. I had been talking about the lifestyles that most of us take for granted, just now, in countries like these. I said, ‘I don’t believe these lifestyles are going to be made sustainable.’ The next time the host came to me, he asked, ‘So, in this future you’re talking about, how many humans are left at the end of the century? Are we talking a hundred thousand, a million?’

    The question threw me; I didn’t know how we had got here. But afterwards, as I went over the interview again, the best explanation seemed to be that he had taken my suggestion that the lifestyle of the western middle classes is going away and equated it with the elimination of 99.99% of the human species.

    For ten years or so now, I have been lurking around a few of the ‘collapse’ blogs, the corners of the internet where people think out loud about the end of the world as we know it. There are sites whose authors are loudly certain that this means imminent human extinction, but the ones to which I find myself returning are written by people who are trying to think around the edges of the world we have known, to catch sight of the unknown worlds that may lie around the corner. It is easy to imagine the apocalypse – easier than to sustain the belief that things can go on like this – but what is hard is to recognise the space between the two, the messy middle ground in which we are likely to find ourselves. At its best, like certain kinds of science fiction, the writing of the collapse bloggers provides a work-out for the historical imagination.

    Among these online conversations, one of the distinctive voices belongs to an American who usually writes under the name Anne Tagonist – or at her current site, More Crows Than Eagles, Anne Amnesia. Over time, I’ve been struck by the breadth of her frame of reference, but also by her willingness to puzzle through a question, sharing her uncertainties. Lately, I had noticed her picking up on posts from Tom James and Charlotte Du Cann on the Dark Mountain blog, so I got in touch to propose an interview. Thinking back over the decade since I stumbled across the collapse blog scene, and knowing that Anne has been around it longer, I started by wondering what shifts she had noticed – though, as she pointed out, the timeframe we are talking about is rather a short one.

    DH: I’m curious how you found your way to this corner of the internet in the first place – and how you’ve seen it change, between then and now.

    AA: I wanted to start by saying something like, ‘Growing up, I would reread Walter Miller’s Canticle for Leibowitz every autumn,’ or some such deep explanation. But anxiety about the long-term stability of a complex and interdependent lifestyle has been part of western culture since at least Tacitus. Have you read the Germania? It’s pretty clear that Tacitus was the late Roman Jim Kunstler: it’s all about what intolerant hard-asses the Germans are, completely unlike the effete and decadent Romans, and how this makes them a much more honourable people. It seems to me to be a document of a sort of self-doubt in the heart of what was still, in 98 CE, the most powerful empire on the planet, worrying that too much ‘civilisation’ had sapped Roman vitality and – it’s clearly gendered in the text – manliness.

    I should note that by the time the Vandals sacked Rome three hundred years later, they had become an intercontinental trade empire with bigger cities than Tacitus had ever seen, so it’s open to interpretation whether he was ‘proven right’.

    Anyway, the point is, people have worried that they were getting ‘too civilised’ and were risking some sort of collapse, if the interdependence implicit in cosmopolitan society were disrupted, for a very long time. Tracking my own interest is really only the story of my own life, which is why Walter Miller isn’t a bad place to start. Worth noting, though, is that I also grew up with a ‘nuclear air-raid drill’ every Wednesday at noon in my hometown. It wasn’t the much-mocked duck-and-cover, just a test to make sure the siren still worked, but the reminder of an existential threat was just as piercing as it would have been in the sixties. The library stocked civil defence pamphlets, if you wanted to know how to make a hand-powered ventilator for your underground shelter that would screen out all but the smallest fallout particles. It was just something that you learned to accept.

    Canticle is a three-part story, set at different stages of future history after a nuclear war. Unlike most global disaster science fiction, it was less about conflict between the ‘good’ survivors and the ‘bad’ and more about how to restore a sense of dignity and meaning to a human race that now lives undignified, hard-scrabble, starvation-plagued, miserable lives and, worse, has brought this condition on itself through hubris and self-destruction. It’s a very Catholic book, obviously: how do you exalt the fallen?

    I think that’s still the contradiction that runs through my own writing: we are prone to horrible rages and self-destruction, and yet we are still beautiful and imaginative and worth loving. We have fucked up pretty much every part of our ecosystem worth up-fucking, and yet we are the only species we have the option of being. We are both fragile and indestructible, stupid and self-sacrificing, fear-shot and able to party in the ruins of any catastrophe. What do you even do about that?

    I was a zine writer and a traveller, and somebody swapped me a copy of Evil Twin’s Not For Rent just before the US got its first road occupation, in Minnehaha, Minneapolis. So of course I had to go. It was back just before the WTO protests in Seattle, and John Zerzan and Live Wild or Die were the big deal on the radical-eco scene, so that was the world I lived in. The Y2K bug pushed a lot of people to imagine what would happen without computers, and living as we did in an extremely improvisational and DIY sort of way, it was hard to see that kind of collapse as a bad thing, especially because it would probably bring an end to the paving and deforestation that we were fighting. I developed this sort of double-vision where I would see things as they ‘were’ and superimposed I would see them as they would be after a century of abandonment and neglect. I can’t explain this, it still comes over me sometimes; sometimes it’s beautiful and sometimes it’s terrifying.

    I had a tense relationship with the anti-civilisation kids because, on the one hand, I helped organise a few gatherings, while on the other hand, I thought most of them were nuts. I took a road trip with one kid who kept trying to convince me that being polite to strangers was a crippling weakness induced by civilisation, and so this person was intentionally rude, all the time. There were also a few professional gadfly types who showed up, for instance, to a collective in Texas where I was living and spent a week on the couch denouncing things. The positive ‘rewilding’ lifestyle hadn’t really come together and there was no way for people to live that felt like they weren’t betraying themselves constantly, so of course they were very grouchy all the time.

    I became fairly disenchanted with the anti-civ/collapse idea in Texas, but two things happened in rapid succession to change my outlook. First, in the July 2005 issue of The Atlantic, James Fallows wrote ‘Countdown to a Meltdown’, an article that pulled together peak oil, the housing bubble (still largely unrecognised) and a pending currency crisis that doesn’t seem to have come true. This was the most significant and official acknowledgement of the instability of ‘progress’ that I’d come across that was immediate and concrete, rather than just being a hand-wavy extension of cultural anxiety and disguised Cold War nuclear horror. It put the possibility of serious changes in the ‘American Way of Life’ back on the table.

    The second event was Hurricane Katrina. After joking sarcastically on my then blog that anti-civ kids should all go down to Katrina and see what life was like without the infrastructure of a functional community, I watched several people, friends and unknowns, go do exactly that and turn what was probably the biggest disaster in the US in my lifetime into a brilliant experiment in post-collapse living. The punk/squatter scene in New Orleans before the storm was more notable for axe fights and really bad drunkenness and suddenly, with a shared purpose, those same kids started improvising electrification schemes for storm-damaged neighbourhoods and building out community-run infrastructure. It made me rethink my cynicism about anti-civ activism. I now think that in the absence of civilisation, most people – perhaps not anti-civ activists, but that’s another story – would be helping improve the lot of strangers and would recreate the best parts of a society to the best of their ability.

    Seeing this happen mellowed me a lot to the new generation of collapse writers, at least to those who weren’t daydreaming about shooting their neighbours at the first sign of currency depreciation. I found Ran Prieur’s blog when another collective I was living in, this time in Philadelphia, was looking into growing edible algae on the roof. He had written about a book that he’d been unable to track down on microalgae, and my college library had a copy. So I read it, sent him a review, and we’ve been trading emails ever since. Having Ran and his readership to interact with made me more comfortable with writing collapse-y articles for a general audience. 

    DH: Reading this, I got a string of flashbacks. To being eight years old and finding out about nuclear weapons and wondering why all the adults seemed to be going on with their lives, acting normal, as if this wasn’t the most important and horrifying thing ever. (I wonder if kids still have that now, when they learn about the missiles in their silos, or if it was specific to the Cold War and the explicit promise of Mutually Assured Destruction?) And then to being twenty and reading about Y2K, the sensation of vertigo, that the world as we know it might just end. As I turned this possibility over in my head, wondering what to do with it, another thought came – that everything is going to end, sooner or later, anyway, from one chain of events or another. I remember this as a weirdly comforting realisation.

    The way you describe your trajectory, it sounds like you came across the anti-civilisation and collapse stuff within a whole context of zines, activism, punk, DIY culture. On this side of the Atlantic, it seemed to be less grounded, something I came across late at night on blogs written by people I’d probably never meet, or through books by Zerzan and Jensen. When Paul and I wrote the manifesto and called it Uncivilisation, we were thinking on slightly different lines – not so much ‘anti-’ some big other of civilisation, more how do we start disentangling our thoughts and hopes from all these illusions and grand narratives – but one effect was to create a meeting point where people who were trying to figure this stuff out could find each other, including getting together around festival campfires and having conversations that there didn’t seem to be room for in the activist spaces that a lot of us had been involved in.

    One thing that’s troubled me over time, and you touched on this already with Tacitus and the Germans, is how male and white and straight the perspective of most of the writing about collapse tends to be. Not that this is in any way different to the perspectives that get most of the attention on almost any other subject, but still, it distorts the way this stuff gets talked about. I remember Vinay Gupta coming to the first Dark Mountain festival, looking out at this very white audience, and saying, ‘What you people call collapse is living in the same conditions as the people who grow your coffee.’ And I’m thinking of a post of yours about ‘Poverty, Wealth and the Future’, where you wrote, ‘Your meditation for today is this: Who would you be in a refugee camp?’ I keep coming back to that post.

    So I wonder if you could say some more about the way those distortions shape the tropes of collapse writing, and what it might mean to ground our thinking in a broader range of human experience?

    AA: Well, it’s interesting to read your question about the persistence of the nuclear threat on the same day that I learn about a ‘mishap’ with an intercontinental ballistic missile in May of 2014. I’m not sure it still has the same emotional resonance that it did in my own childhood, but we’re stuck for the foreseeable future living in a world where a dropped wrench can potentially end the planet. I’d be interested to ask young people how often they worry about this sort of thing – it’s probably important that in my childhood (and yours, it sounds like) nuclear annihilation would have been a deliberate action by a hostile agency, rather than a stupid mistake. I think we have an easier time psychologically facing down an opponent, no matter how implacable or even incomprehensible, than we do confronting random chance and tragic error.

    You’re definitely right that my own development as a ‘collapse’ writer happened socially, rather than in isolation, but I don’t know whether that’s common or not. The internet would suggest that the typical apocalypse fan came by their beliefs consuming media alone, and in fact tries to hide their ‘preparation’ from their friends and neighbours, either because they don’t want to be besieged by hungry zombies when the shit hits the fan or because they’re embarrassed to subscribe to a philosophy that treats their nearest and dearest as a zombie-level liability in need of murdering some time in the near future. Then again, isolated angry people tend to make up a disproportionately large part of any internet conversation, so I’m not sure whether this is actually a relevant metric.

    I think this gets to what you’re saying about one person’s collapse being someone else’s daily life. There’s a meme that crops up whenever there’s a major disaster anywhere in the world: American journalists report back from the scene, bewildered at the ‘resilience’ of the local population, who are struggling onward, getting out of bed and tending to things on schedule, despite the horror that has befallen them. Having seen disasters in the first and third worlds, I am most taken by the aspect of surprise. It’s as if the reporters assume that the natural response to an earthquake, or a coup, or an outbreak of a terrible disease is to run in circles fluttering your hands until you are too exhausted to continue, or worse yet to kill yourself.

    For a long time I thought this was just the idiocy of the American journo narrative, but since Superstorm Sandy, we’ve seen several crises hit not just the US, but relatively affluent parts of the US. Surprisingly, I have seen people quivering before the cameras, seemingly helpless, asking, as if the amassed TV audience were able to answer, what they are supposed to do. It struck me that this is actually not a disaster response, it’s a fear response – it’s the sort of thing people do, not when their way of life has been completely overwhelmed and they have an extended to-do list just to get through the day, but when they feel that way of life is deeply and permanently at risk, and there’s (as yet) nothing they can actually do about it. When you live in a crappy corrugated iron shed, next to a thousand other crappy corrugated iron sheds, and you walk two miles each morning before dawn to pick coffee, a flood is a crisis that demands attention, and your whole mind is full of getting the kids to high ground, waiting in line for the satellite phone to call your cousin in the city, and otherwise managing your situation. Freaking out would be counterproductive and, as a rational, caring human being, you don’t. When you have a mortgaged, expensive, oversized house with flood insurance and a national guard that will make sure your family won’t die, when you have a credit card to stay at any cheap hotel in the state, when your daily social status concern is whether people like your posts on Facebook, you have the luxury to freak out, even if it is pointless to do so.

    I think collapse writing, when it isn’t pathological anger, is something like a luxury freak-out. It’s about a looming fear of shame, of loss of status, loss of comfort, a return of the repressed hardships we know are entailed by our lifestyles, if not by life itself, but which we generally manage to outsource to someone else. I think this is why so many collapse narratives are moral in nature – there’s a sense that we’ve got it coming, but so far we can continue to work at our near-sinecure jobs, worry about our near-frivolous social status variations, argue about increasingly minute differences in consumption patterns (organic? paleo?) and otherwise ignore our insecurities. There’s no pile of cinderblock to be dismantled, no giant slop of mud in the cowshed, no long road to safety that we can walk, nothing, that is, to ameliorate our situation – it is as fixed and inaccessible as history itself, which is why it often seems like something that’s already happened.

    As to your question about whiteness, I’d come at it from a slightly different angle. Especially since the anti-colonial renaissance of the sixties, people from the Global South have been writing passionately of their experiences losing their social structures, their way of life, their sense of self when an aspect of the natural world, or the political superstructure, or some seemingly insignificant part of ‘the economy’ rolls over and takes a crap. Why are collapse bloggers not reading these, or at least not incorporating them into their understanding of the future? Why are the experiences of the third world not considered collapse writing? Why, when you see a reference to a kinship society torn to pieces by a land-grabbing human-enslaving culture-ignoring ‘civilisation’ storming in with guns, is it always Braveheart and not, say, Chinua Achebe? Where, in all the paranoia about Ebola a few years back, was any acknowledgement of slave trade or the ivory trade or the rubber trade or any of the long history of civilisation-ending catastrophes in that exact region?

    I would proffer two unflattering analyses. First, it is hard for western white people to identify with people of colour, especially poor people of colour, especially poor people of colour living outside what we think of as the west. That’s a disgrace and perhaps a damnation upon us, but it’s a bigger problem than the collapsosphere. Secondly, I think the stories from The Everybody Else don’t contain that vertiginous loss of status, that plunge from the precipice into the subaltern, and hence aren’t readily associated with the same frisson of hand-fluttering panic. These two analyses are connected of course – much of the social status in the world seems insignificant to the very wealthy, hence the popular reading of The Boy Who Harnessed the Wind as a tale of a third-world startup entrepreneur. The idea that a family who grows corn, sells cattle, and has a market stall would be the absolute last family who should be starving seems unfamiliar to a western culture where farmers, at least farmers who did not grow up on organic box cereal in Brooklyn, are assumed to be ignorant and dangerous, so readers don’t pick up on exactly how socially fraught, even shameful, the Kamkwambas’ plight actually is.

    Exceptions worth mentioning are Delany’s Dhalgren and Octavia Butler’s Parable of the Sower, which are both by and about black Americans. Underrated, I think, both of them, but still reasonably well incorporated into the collapse bibliography. Also, several science fiction books and articles about the third world, written by first-worlders, have achieved some celebrity. I’m thinking of Gardiner Harris’s work on sea-level rise and slavery in Bangladesh, Margaret Atwood’s backstory to the character Oryx in Oryx and Crake, Ian McDonald’s Chaga Saga and Brasyl, Paolo Bacigalupi’s entire oeuvre. Still, these are observations and fictional projections, not first-hand experience, and they certainly shouldn’t substitute for the (existing) shelves full of analysis on social disruption written by subaltern populations themselves, so Vinay’s point stands.

    I imagine that as more people are directly affected by climate change, in the west as well as elsewhere, first-world collapse writing – at least first-person first-world collapse writing – will become more measured, realistic, and task-saturated, plaintive in its appeals to justice but competent in its approach to the immediate future; in other words, more like the last century of third-world collapse writing. Whether this displaces status-shame-panic as a genre, or whether ‘collapse’ will continue to refer only to the terrors of the shrinking pool of people who have not yet been directly impacted by changes in the world remains to be seen.

    DH: What you say about the luxury of freaking out makes me think of a conversation in the Dark Mountain Workshop here in Sweden. Two of the artists in the workshop have been studying with the same teacher, an autistic woman who had to develop her own models in order to understand the reality which people around her take for granted. These models start from the distinction between ‘the primary’, the reality of life itself, and ‘the secondary’, the reality of values, culture, a particular way of living. We need both of these, she says, but there is a danger if we treat the secondary as primary. There’s a particular ugliness, it seems to me, when we start defending our way of life as if our lives themselves were at stake. We’re seeing this in Europe, just now, as people whose lives actually are at stake – at home in Syria, on the wretched rubber dinghies crossing the Mediterranean – are greeted as a threat to our way of life, so that the Greek government is being urged by ministers from elsewhere in Europe to start sinking the boats.

    Thinking about this collapsosphere that we’ve been talking about, there’s a similar confusion that runs down the middle of it. A lot of folks who are drawn to these online conversations are certainly ‘apocalypse fans’ – people for whom ‘the end of the world as we know it’ means zombies and cannibalism and fortified compounds, who read Cormac McCarthy’s The Road as a guide to the future. But the other side – which I guess is the reason I find myself coming back, despite the ambivalence that we’ve touched on in this discussion – is that this can sometimes be a space in which people are developing a language in which to talk about the difference between ‘the end of the world as we know it’ and ‘the end of the world, full stop’. That feels interesting, maybe even useful.

    One more question, before we wrap this up. Anyone who has followed your blogs will pick up that your work is related to medicine. The conversations around the collapsosphere are often intertwined with some kind of critique of progress – and the achievements of medicine and public health are often used as a trump card by those defending the idea of progress. So I’m curious what reflections you might have on how these things fit together?

    AA: I worked as a ‘street medic’, then on an ambulance, and then I became a medical student before shifting gears to become a researcher. The idea that medicine and public health are ‘trump cards’ for progress is only relevant if you believe that progress (or not) is a decision being made by rational people on the basis of evidence, which it isn’t. If it were, I’d choose progress which allows everyone on earth to enjoy the actual benefits of a modern western standard of living (not losing half your kids to dysentery, seawalls, language schools and construction methods that acknowledge the inevitability of fires and earthquakes), low- to zero-consumption technology, rewilding of land, informed stewardship of agricultural areas, and continued best practices in medicine and public health – and avoids wasting any resources on the bogus fake benefits (candy, cheap long distance travel, extra clothing, jet skis, Facebook). But progress doesn’t work that way. I’m not the queen of the world and don’t want to be; I’d imagine pretty much everybody on the planet could make their own lists of which developments count as progress and which don’t; and I doubt there’d ever be any consensus. Plus, when you think about what it would take to actually create and enforce ‘rational progress’, you realise it isn’t going to happen.

    That said, hell yes – you can keep your ‘man walks on moon’, I say the high point of collective human achievement is the elimination of smallpox. We’re less than five years from eradicating polio and guinea worm too.

    DH: Yes, I guess where I was going with the ‘trump card’ thing is that, if you try bringing the idea of progress into question, pretty soon you will be met with an argument along the lines of, ‘Are you saying smallpox eradication, aspirin and the reduction in infant mortality aren’t real, good and important developments?’ And it’s hard to imagine a sane standpoint from which to deny the reality, goodness or importance of these things. So it seems like a fairly unanswerable argument that progress exists as an objective historical phenomenon – even if it is a phenomenon that’s vulnerable to setbacks and reverses, not an inevitable one-way force. Against this, I’d want to say that it may be wiser to attend to the specific, to celebrate these good things for what they are – whereas once we start talking in terms of progress, however carefully, this tends to become a generalised idea under the heading of which things we would want to celebrate and things we might want to question get bundled together. This is one of the senses in which progress is not ‘a decision being made by rational people’, as you say, but an intensely-charged narrative with a particular cultural history.

    AA: Ran talks about this phenomenon – that people lack the imagination to picture a world other than one we’ve already lived through, so if you don’t like The Now, you must be arguing for some time in the past with all its warts and insufficiencies. There are two ways to approach this, I think: one is from an engineering perspective, explaining that just because we won’t have ubiquitous cheap energy at some point in the near future, or just because our current patterns of settlement are unlikely to persist, that doesn’t mean we’re going to forget everything we know about waterborne diseases. This is a difficult argument to make, though, because I am not a water treatment engineer and chances are neither is whoever I’m talking to, so it’s hard to avoid making wild and inaccurate claims about technology and technê.

    The other approach is one that I’ve seen used to refute eugenics – imagine a future historian looking back at the early twenty-first century. Would they, in all likelihood, have any more respect for us today than we have for open-pit lead smelting and leech-doctors? Whoever comes after us will probably look at us as flawed, short-sighted and blind to the effects of our actions, just as we see our predecessors in hindsight. Their successors will view them the same way. Even if they do re-adopt older technology, or forget some aspects of ‘progress’ that are unblemished boons to humanity, they’ll still see us as morons, just as we can lament the passing of the Stick Chart or the ability to recite the Odyssey from memory, while being grateful that we don’t actually live in those worlds.


    First published in Dark Mountain: Issue 9.

  • Expectations of Life & Death

    The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labour and sorrow; for it is soon cut off, and we fly away.

    Psalm 90:10

    What it means to grow old has changed enormously, within a handful of generations, yet not in the way that we tend to assume.

    The headline figures are startling: no country in the world today has a lower life expectancy than the countries where life expectancy was highest in 1800. A baby born that year in Sweden could expect to live to the age of 32; a descendant of that baby, born in the same country today, can expect to live to 82.

    What is commonly misunderstood is the nature of the change behind these figures. They seem to suggest a world in which to reach your early thirties was to be old, in the way that someone in their early eighties would be thought of as old today; a world in which life was truly ‘nasty, brutish and short’. Yet the reality is that the age at which a person is thought of as old has changed relatively little from century to century, even as far back as biblical times, when the psalmist could lament the brevity of human life which stretches to 70 or 80 years. What is different today is that living to grow old has become a reasonable expectation, something we can almost take for granted, rather than a matter of luck.

    The reason for clarifying this distinction is not to downplay the extraordinary achievement represented by the increase in life expectancy at birth, but to seek to understand it better. This matters, not least, if we want to think clearly about the promises and claims being made today in the name of life extension. To do so, we need a subtler feel for statistics and also for the cultural assumptions that shape our understanding of death.

    Among the contradictory tendencies that make up our culture, there is a habit of treating the fruits of measurement and calculation as revealing an underlying reality that is ‘truer’ than the deceptive evidence of our senses. It may be more helpful to think of the results of quantitative labour as the traces left by reality: footprints in the sand, clues in need of interpretation.

    If the figures of life expectancy at birth are one set of footprints left by the lives our ancestors led, another trail of clues is found in the measure of the modal age at death. This tells us at what age it has been most common to die, a slightly different question to the average length of life, and one that takes us closer to the experience of growing old in a particular time and place.

    In England, reliable records don’t stretch back quite as far as they do in Sweden, but it is possible to pick up the trail in 1841, when life expectancy at birth was a little over 40. In the same year, the modal age at death was 77 for women and 70 for men.

    Over the following century and a half, these ages would go up to 88 and 85, respectively: a significant increase, but not of the same order as seen in the more commonly cited figures for life expectancy.

    What is going on here? Why do these two ways of tracing the changing patterns of death tell such different stories? Part of the answer is that the figures for modal age at death ignore all deaths before the age of 10. Until relatively recently, the age at which it was most common to die was zero: a significant proportion of those born never made it past the first weeks and months of life. The decline in infant mortality is not the only factor in the changing of our expectations of life and death, but it is a large one, and it separates the world in which we now live from the world as our ancestors knew it.
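The gap between these two measures is a matter of simple arithmetic, and can be seen in a small sketch. The age distribution below is stylised for illustration, not historical data: in a population where nearly a third die in infancy but most survivors die around 70, the average age at death lands in the low forties, while the modal age at death among those who survive childhood remains 70.

```python
from collections import Counter

# A stylised age-at-death distribution for a pre-modern population
# (illustrative numbers only, not historical records): out of 100
# deaths, 30 occur in infancy, 5 in childhood, the rest mostly in old age.
deaths = [0] * 30 + [5] * 5 + [40] * 10 + [60] * 15 + [70] * 25 + [80] * 15

# 'Life expectancy at birth' is the mean age at death.
mean_age = sum(deaths) / len(deaths)

# The modal age at death, counted (as in the essay) among deaths at age 10+.
adult_deaths = [age for age in deaths if age >= 10]
modal_age = Counter(adult_deaths).most_common(1)[0][0]

print(f"mean age at death: {mean_age:.0f}")        # low forties
print(f"modal age at death (10+): {modal_age}")    # old age
```

The same population thus leaves two very different footprints: a ‘life expectancy’ of about 43 and a modal age at death of 70, without anyone in it having been ‘old’ at 40.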

    What grounds could there be for leaving aside the great swathes of death in infancy and early childhood? Clearly, they must be part of any attempt to form a picture of what age and dying have meant through time, but there are reasons for treating them separately from death in adult life. The first is that it is their inclusion in the averages of life expectancy which creates the misleading impression of a world in which old age began in one’s early thirties. The second is that the causes of death in infancy are different to the causes of death in adult life.

    Broadly speaking, it makes sense to think of a human life as falling into three phases: the vulnerability of the first years gives way to the strength of adulthood, then after five or six decades, this strength gives way in turn to the frailty of age. In each of these stages, we are less likely to die in a given year than were our ancestors, but the things that are likely to kill us are different and so are the factors that increase our chances of survival.

    Along with the idea that our ancestors could expect to die in their thirties, perhaps the most common misconception about the changing nature of age and death is that it is the result of advancements in medicine. While medical technologies and interventions have played a part, it is not the leading one. Of the 30-year increase in life expectancy that took place in the United States during the 20th century, only five years could be attributed to medical care: the remaining 25 years were the result of improvements in public health.

    This is good news. Compared to medical procedures and drug treatment programmes, public health measures tend to be cheaper, and they reach those who do not have access to highly trained medical staff. What is more, while medical treatments frequently come with negative side-effects, improvements in public health tend to correspond to broader improvements in quality of life for the individual and society. A recent project in the north-east of England saw the National Health Service paying to insulate the homes of people with chronic health conditions, a move which could be justified in terms of the savings from reduced hospital admissions among the group.

    The benefits of clean water and sanitation are particularly important to increasing the chances of survival in the vulnerable first years, whereas the benefits of advanced medical treatments are more likely to add years to the end of our lives. The importance of public health explains why increases in life expectancy have spread far beyond the reach of highly equipped hospitals. The most striking example is the Indian state of Kerala, where the average income is three dollars a day, yet life expectancy and infant mortality rates are close to those of Europe and the United States.

    Such examples matter because they can bring into question the ways in which the future is usually framed. Among these is the tendency to present it as a choice: either we find a way to sustain and extend the way of life taken for granted by roughly one in seven of the people currently alive, with its technological and economic intensity, or we lose this way of life and fall into a Hobbesian nightmare. The Kerala story is complex, but among other things it is a clue that there are more futures available than we are often encouraged to think about.

    Death is a biological reality, a hard fact that lies in front of all of us. It is also deeply cultural, entangled with and inseparable from the stories we tell about ourselves, the world and our place within it.

    In the 1960s, the sociologists B.G. Glaser and A.L. Strauss identified two contrasting attitudes to death in American hospitals. For one set of families, mostly recent immigrants, the approach of death was the time to leave the hospital, so that one could have the dignity of dying at home according to custom; for another group — those ‘more involved in modernity’, as the historian Philippe Ariès puts it — the hospital had become the place where one comes to die, because death at home had become inconvenient. Much could be said about these two attitudes, those who ‘check out’ to die and those who ‘check in’, but it is hard to reduce them to a simple trajectory of historical progress in which the modern approach renders the older traditions conclusively obsolete.

    Life expectancy — and death expectancy, for that matter — is good ground from which to think about the ideology of progress. It is hard to imagine anyone who would dispute that the improved life chances of the newborn represent an unqualified good. And at this point, I must disown any pretence at detachment: as I write this, I am thinking of my son, who was born nine weeks ago. I can be nothing other than thankful at the good fortune that he was born into a world — and into a part of the world — where childbirth no longer carries a significant likelihood of death for mother or baby, and where the conditions, the knowledge and the facilities are present such that we can almost take it for granted that he will make it through the vulnerable first months and years of life.

    Having acknowledged this, what else could there be to say? Except that, as we have already seen, when the great changes in infant mortality are compounded into a single vector of improvement in life expectancy, the result tends to give us a misleading picture of the relationship between our lives and the lives of our ancestors. In the same way, the problem with the ideology of progress is that it requires the reduction of the complex patterns of change from generation to generation into a single vector of improvement, and the result is similarly misleading.

    This may come into focus if we begin to think about life extension, a proposition around which bold predictions and promises are currently made. Those who foresee a future in which human life is measured in centuries rather than decades often appeal to the historical statistics of life expectancy, as if the offer they are making is a natural extension of a process that has already been under way for generations.

    Yet, as we have seen, this is based on a misunderstanding of what lies behind those statistics. 80 is not the new 30 — and if someone wishes to convince us that 200 will be the new 80, they cannot call on trends in historical life expectancy as evidence for this.

    In fact, it is not clear that the possible duration of human life has been extended. The existence of an upper limit to the human lifespan is a matter of dispute among those who study this area. (Those who study human bodies seem to be more inclined to believe in such a limit than those who study statistics.) It is true that there has been an upward movement in the age of the oldest attestable human over the past two centuries, with the record held by Jeanne Calment, who died in France in 1997 at the age of 122.

    However, while Calment’s case is considered exemplary in terms of the documentary proof available, attesting the age of the extremely old remains difficult in many parts of the world, even today, and in earlier historical periods, absence of evidence cannot simply be taken as evidence of absence.

    What can be said more confidently is that almost all of the increase in longevity that we now take for granted consists of a shift in the distribution of death within historically known limits. It has not been unusual for some individuals within a community to live into their late 80s; what is new is that living into one’s late 80s is becoming the norm in many societies.

    Changes in infant mortality may represent an unqualified good, but when the strength of adulthood gives way to the frailty of age, the changes in what we can expect may be more open to dispute.

    To generations of doctors, pneumonia was known as ‘the old man’s friend’, a condition that tends to leave the healthy untouched, but offers a relatively peaceful death for those who are already weakened. The expression reflects the idea that there is such a thing as a time to die, rather than the role of medicine being always to sustain life at all costs. Today, pneumonia in the very old is fought with antibiotics. Meanwhile, 40% of those aged 85 or over are living with dementia. Our culture can still talk about an ‘untimely death’, but the idea that death is sometimes timely is harder for us to acknowledge. To anyone who has watched a person they love pass into the shadow of Alzheimer’s disease, the question can arise whether there is indeed a time to die — and whether our way of living increasingly means that we miss that time, living on in a state that is neither life nor death.

    To such thoughts, the answer will come: we are investing great amounts of money and talent in the search for a cure to Alzheimer’s.

    And, for that matter, in the search for a cure for ageing and a cure for death.

    If I were to claim that these goals are unattainable, I would be exceeding the bounds of my knowledge. Instead, to those who seek them, I would make two suggestions.

    First, as I have tried to show, the search for life extension is not the natural continuation of the trends that have led to increased life expectancy over the last handful of generations. The bulk of the achievements in life expectancy have been the result of public health improvements, rather than high-tech medicine, and their overall effect has been to increase the likelihood of growing old, rather than change the definition of what it is to have grown old.

    Secondly, it seems to me that the pursuit of vastly longer human lifetimes is itself a culturally peculiar goal. To see it as desirable to live forever is to have a particular understanding of what it is to be a person: to place oneself at the centre of the universe, rather than to see oneself as part of a chain of generations.

    When I look at my son, I feel gratitude for the chance at life that he has. I hope to live to see him grow strong and take the place that is mine today, as I learn how to grow old and take the place which is now my parents’. And I hope that he will outlive me.

    I know that there is much that he and I can almost take for granted, just now, that our ancestors could not. Yet I suspect that my hopes are not so different to theirs, and as I hold him and look into his new face, I understand myself more clearly as a small part within something vastly larger.


    First published by Mooria magazine.