Sunshine Recorder

Link: Diana Athill: It’s Silly to Be Frightened of Being Dead

At the age of 96, the legendary editor Diana Athill writes, the idea of death has never been less alarming. The process of dying is another matter.

Back in the 1920s my mother never went to a funeral if she could help it, and was horrified when she heard of children being exposed to such an ordeal, and my father vanished from the room if death was mentioned; very much later, in the 1960s, when the publishers in which I was a partner brought out a beautiful and amusing book about the trappings of death, booksellers refused to stock something so “morbid”. I was born in December 1917, so was fully immersed in this refusal to contemplate death. Indeed it was not until more than 30 years later, when I had to visit a coroner’s office to identify a woman who had been found dead, that I thought for the first time how extraordinary – indeed how ridiculous – it was to have lived for so long without ever having seen a dead body. I have heard it suggested that this recoil from the subject was a result of the first world war filling everyone’s minds with an acute and appalled awareness of death, but my own explanation was, and still is, that it was a pendulum-swing away from the preceding century’s obsession with the subject – the relish for mourning, ranging from solemn viewing of the corpse by young and old alike, to passionate concern about the exact degree of blackness to be worn, and for how long (for the rest of your days if you were a widow). A mood so extreme surely had to result in a strong reaction.

It seems to me that what influences the consciousness in wartime is not death. It is killing. And no, they are not the same thing.

Death is the inevitable end of an individual object’s existence – I don’t say “end of life” because it is a part of life. Everything begins, develops – if animal or vegetable, breeds – then fades away: everything, not just humans, animals, plants, but things which seem to us eternal, such as rocks. Mountains wear down from jagged peaks to flatness. Even planets decay. That natural process is death. Killing is the obscene intervention of violence, the violation which prevents a human being or any other animal from reaching death as it should be reached. Killing certainly did affect the minds of those exposed to the first world war. It shocked most of them into silence: many of the men who survived fighting in it never spoke of it, and I think it had the same effect on most of those the men returned to. It was too dreadful. They shut down on it.


My maternal grandparents’ house, in which the children of my generation spent all their holidays, and where we stayed if our empire-serving parents were abroad in some place inhospitable to the young, was typical of those times in that the only music-making objects in it were an upright piano and a small wind-up record player that had belonged to my uncle when he was a boy: a condition probably unthinkable to children today. There was no pop music because there were no teenagers, only children and grownups. Certainly once the children had turned 12 they began being restive (the grownups called it “the awkward age”) but there was little to be done about it. There were music-hall songs and dance music, but they could only come into a home via sheet music and if there was someone there who could play the piano, and the limit of adult piano-playing in our family was nursery rhymes to amuse the little ones. A hint of the future might have been detected in the eagerness with which we children fell on Uncle Billy’s little “gramophone”, which had been forgotten by the grownups. We listened over and over again to the few records that went with it – some Gilbert and Sullivan songs and two or three spirituals sung by Paul Robeson. Right at the back of the cupboard where they lived I once found another record, which turned out to be a wartime song, a comic and rather witty version of Who Killed Cock Robin called Who Killed Bill Kaiser. Although I was born before the war’s end, it was as remote and unreal to me as the Wars of the Roses, so I was as thrilled as I would have been if I had dug up a medieval helmet, and ran to show the record to my mother. All she said was, “That old thing – is it still there?” It was a shock to come up so suddenly against the fact that what to me was history, to her was just something from the day before yesterday.
Absolutely no trace of that day before yesterday had been injected into my consciousness by my elders, so whatever I was to feel about death, it had nothing to do with war.

My own experience of the second world war confirmed this. Before it started, during the horrible months when we could all feel it coming, I said to a friend: “If it does start I think I’ll kill myself.” (Although the preceding war had been little talked about, poets and novelists had written about it, so we were fully aware that a repetition ought to be unthinkable.) My friend replied, “Killing yourself to avoid being killed would be a bit silly,” and I felt sadly that she was being obtuse. It was not the prospect of being killed that was distressing me, it was having to know this obscenity about life. And that, not fear of death, was what polluted one’s consciousness all through the war, so that the moment it was over we too shut down on it.

Because we did shut down. “It’s over!” That knowledge wiped out any other feeling. Although I have never doubted (heaven knows why) that we were going to win the abomination, there had been times when I had not thought – perhaps “thought” is wrong – when I had not felt it possible that it would ever end. In one’s twenties a year is a very long time, and there had been so many of them. It astounds me now when I hear or read people describing the 1950s as dreary, because to me they were wonderful. What did it matter that rationing dragged on? We were getting more for our coupons every day – it was slow of course, but how could it be otherwise after what we had been through? Now we had our new Labour government, we had the National Health Service (how can anyone forget what a miracle that was?), we had Dior’s ravishing New Look, we could travel again and who cared if we could take no more than £25 with us when it was so amazing what one could do on £25 in France or Italy or Greece. I could see no reason to be anything but happy, and death was just something that would occur when I was old – and which was not, and never had been, frightening.

That this was true, I owe to Montaigne. I can’t remember when I read, or was told, that he considered it a good thing to spend a short time every day thinking about death, thus getting used to its inevitability and coming to understand that something inevitable is natural and can’t be too bad, but it was in my early teens, and it struck me as a sensible idea. Of course I didn’t set out to think about death in a regular way every day, but I did think about it quite often, and sure enough, it worked. Why coming to see death’s naturalness should have caused belief in an afterlife to melt away, I am unsure, but it did. Probably that belief had been no more than an unexamined acceptance of something said by a grownup: in a child’s life there are many things more important to question than the probability of reuniting after death with other dead people – ideas that are tucked away on a back shelf of the mind like some object for which one has no use at present.


When I was 16, I had my appendix removed: an operation common in those days which seems to have gone out of fashion. Going under the anaesthetic, which was chloroform, caused an interesting confrontation with that particular idea. As a little girl I had occasionally suspected that there was a monster under my bed waiting to come out and get me, scaring myself so much that I had to be calmed down and assured that I was imagining it. Presumably the anaesthetist preparing me for the operation diminished the flow of chloroform too soon, because I became conscious, without the least idea of where I was or what was happening: all I knew was that I was lying on my back, on a bed, with a stifling claw clamped down on my face. They had been lying! The monster had been there all the time and now it had come out and got me, I was dying! The dying felt like tipping over the edge of a cliff into black nothingness. I was hanging desperately on to the rim of the cliff. I was staring into that black nothingness – and horror of horrors, understood that it was not nothingness: there were shapes swimming about, things happening, creatures at large out there, and I was about to be pitched in among all that, unprepared, ignorant, totally incapable of coping. It was terrifying – surely one was supposed to change in some way at death, but I was still unchanged, still just my miserably inadequate self. Into my mind there came the thought, “If I start to believe in God perhaps I’ll be allowed to change so that I will know what to do?” At which – and I’m still proud of this – I answered myself: “No! That would be too shameful, just because I’m frightened.” I let go, and down into the black nothingness I slid.

So, when many years later I really was near death as a result of a haemorrhage after a miscarriage, and heard a doctor saying “She’s very near collapse – call the lab and tell him to run” and understood that the “him” was the person fetching the blood they were going to pump into me, I was not in the least alarmed as I dimly wondered if I had the strength left to think some suitable Last Thought, concluded that I hadn’t, and said to myself the words: “Oh well, if I die I die.” I was sure, then, that nothingness was just that.

I live now in an old people’s home with 42 others, our average age being 90, or perhaps a little more. When one makes the difficult decision (and difficult it is) to retire from normal life, get rid of one’s home and most of one’s possessions, and move into such a place (or be moved, which doesn’t apply here I am glad to say) it means that one has reached the stage of thinking, “How am I going to manage my increasing incompetence now that I’m so old? Who is going to look after me when I can no longer look after myself?” Death is no longer something in the distance, but might well be encountered any time now.

You might suppose that this would make it more alarming, but judging from what I now see around me, the opposite happens. Being within sight, it has become something for which one ought to prepare. One of the many things I like about my retirement home is the sensible practical attitude towards death that prevails here. You are asked without embarrassment whether you would rather die here or in a hospital, whether you want to be kept alive whatever happens, or would prefer a heart attack, for instance, to be allowed to take its course, and how you wish your body to be disposed of. Though when a death occurs in the home it is treated with the utmost respect, and also with a rather amazing tact in relation to us survivors, so that I doubt if anyone has ever been disturbed by such activity as I suppose surrounds the moment of a death, and the removal of a body: a carefulness of our peace of mind which must involve very well-planned management.

These matters have become discussable with one’s friends, not, of course, as a frequent part of gossip over lunch in the dining room (our only communal occasion) but from time to time, perhaps when admiring someone’s stoicism if their frailty is becoming painfully apparent, or feeling sad at someone else’s inability to accept what seems to be imminent. As a result of this openness, I think that most of the people here would consider it foolish to be frightened of being dead. All of us, however, feel some degree of anxiety about the process of dying.

That process depends on what you are dying of. The body can fail in ways that are extremely distressing, slowly and painfully, demanding much stoicism, or it can switch off with little more than a flash of dizziness. In my family we seem to have been uncommonly lucky in that respect. There was the 82-year-old uncle who was at a meet of the Norwich Stag Hounds, enjoying a drink with friends, when crash! And he fell off his horse, dead. There was the cousin in her eighties who fell dead as she was filling a kettle to make tea, and the other cousin, 98, who slipped away so gently that the sister who was holding her hand didn’t realise that she had stopped breathing. There was my mother, a week before her 96th birthday, who had one nasty day which, to my relief, she couldn’t remember the next morning, then slept her way out after speaking her last words, “It was absolutely divine,” about a recent drive to a beloved place. My father, alas, had a whole week of unhappiness after a blood-vessel in his brain had ruptured. He looked up as one came through the door, obviously about to greet one, then when he found he couldn’t speak, his expression became one of pain and puzzlement: he understood that something was badly amiss but he didn’t know what it was. The moment of his dying, however, was sudden and painless. My brother was the only person near me who clearly resented death, and that was because he had achieved a way of life which suited him so perfectly that he wanted more. He was not frightened of it. “No one after 80 has any right to complain about death,” he had said to me not long before.

That fortunate record makes me believe that although it would be unwise to expect an easy dying, it is not unreasonable to hope for one. As for after it, I feel quite strongly that I would like my ashes to be scattered or buried in a place I love (I scattered my mother’s in her garden – and the old man who tended it for her when she could no longer do it herself said “Cor! That won’t half make the flowers grow”). But such a feeling, though strong, is really absurd, because what does it matter to the dead how their bodies are disposed of? It is for the mourners to do what suits them best.


A little while ago I took part in a television programme about death that was designed by the photographer Rankin, to help him overcome his fear of it, to which he bravely admitted. Whether it served his purpose or not I don’t know – possibly not, because that fear is brewed in the guts, not in the mind – and I remember a man I once knew who suffered from it so badly that he told me he used to wake up in the night and have to telephone his sister and beg her to come round. “What did she do?” I asked, and he said she made tea and talked sense, but it didn’t do much good because the thought of all those bloody silly birds still twittering and those bastards walking up and down the street when he wasn’t there to see them drove him mad. But even if Rankin’s programme failed to make him feel better, which I hope was not the case, it was excellent, and many viewers responded to it with enthusiasm. I had already understood from the response to my own book, Somewhere Towards the End, that the taboo on the subject of death, so heavy in my youth, was evaporating, and this was a striking example of how true this is. Even teenagers joined willingly in discussion of it.

The contributor to the programme I remember with the most pleasure is the man who said that not existing for thousands and thousands of years before his birth had never worried him for a moment, so why should going back into non-existence at his death cause him dismay? Everyone laughed when he said that and so did I, and as I laughed I thought: “Dead right!”

Link: Death and the Maiden

Freud’s theory of the death drive also gives us a way to think about gender.

Walter Benjamin remarked of the people who experienced the First World War:

A generation that had gone to school in horse-drawn streetcars now stood in the open air, amid a landscape in which nothing was the same except the clouds, and at its center, in a forcefield of destructive torrents and explosions, a tiny fragile human body.

What this body could mean was newly in question. Benjamin discusses economic depression, technological innovation, moral uncertainty, and violence, but the First World War also provoked a crisis of masculinity. Men died, were wounded, and later found themselves unemployed in unprecedented numbers. Meanwhile women, as Sandra M. Gilbert and Susan Gubar argue in No Man’s Land, “seemed to become, as if by some uncanny swing of history’s pendulum, even more powerful.” Tiny fragile human bodies threatened to detach themselves from their traditionally assigned gender roles. At this historical moment, death collided with gender.

Confronted with a profusion of patients shaken by traumatic dreams in the wake of World War I, Sigmund Freud had a theoretical as well as therapeutic problem. He had previously asserted that every dream is the fulfillment of a wish, but the repetition he encountered in traumatic dreams contradicted this claim. In Beyond the Pleasure Principle (1920) he asked, Why repeat something unpleasurable? Why return to the site of trauma?

To resolve this problem, Freud returned to the mist-enveloped beginnings of life itself. There is a “death instinct” that “was brought into being by the coming to life of inorganic substance,” he wrote. Death is not an event but a state; death is inorganic nature. Life arose from this inert primordial condition and its instinct is to return there. Freud is well aware how weird and implausible this sounds, admitting that even he is not convinced by his own eccentric argument.

It may have been that he was trying to resolve two problems at once. Corpses return to inorganic nature, but the mangled war dead returned in dreams. Jagged fragments of memory piercing the flesh of the present, these undead apparitions and the dreamers they haunted were overwhelmingly male. Masculinity smashed to smithereens—torn, limping, fractured, dismembered. Shrapnel embedded in living tissue. When death coexists with life it is not unity but mutilation. Freud looked further back in time than many, but the conservative impulse to restore a previous state of things in the wake of war was widespread.

Female psychoanalysts writing in the interwar years outlined a phenomenon they described as the “masculinization of women.” “We see patients rejecting their female functions,” reported Karen Horney in 1926. Horney likened woman’s resentfully subordinated relationship to men to a worker’s relationship to the boss. She claimed that her female patients often dreamed of castrating their fathers or husbands, simultaneously seeing themselves as “mutilated, injured, or wounded” men. Paradoxically, these women wanted both to destroy and to become men.

Writing on death, Freud does not directly confront the shattering experience of war which forces him to take a peculiar detour into prehistory. He is equally silent on the subject of gender. But his vision of death potentially jeopardizes conventional psychoanalytic understandings of masculinity and femininity—death recognizes no gender distinctions. Freud imagined inorganic nature as prior to life, but his understanding of the death drive is laced with the repressed anxieties about gender that animated interwar discourse. What if his theory is turned on its head? What if inorganic nature, free from gender distinction but now in coexistence with (gendered) life, lay in the present and not in the past? What if the war had killed gender itself stone dead?

Fort/da. Let’s start again.

In the beginning Freud created the heaven and the earth. And the earth was without form and void, and darkness was upon the face of the deep. And the Spirit of Freud moved upon the face of the waters. And Freud said, “Let there be life,” and there was life.

This is how Freud introduces his concept of the death drive in Beyond the Pleasure Principle: “The attributes of life were at some time evoked in inanimate matter by the action of a force of whose nature we can form no conception… The tension which then arose in what had hitherto been an inanimate substance endeavored to cancel itself out. In this way the first instinct came into being: the instinct to return to the inanimate state.”

Before life there was death. Freud doesn’t go into the particulars of this lifeless universe. We might imagine the solitary earth spinning through lifeless galaxies—cliff faces, chunks of ice, mud flats, stalactites, deserted beaches, barren hillsides, boulders, unmined clusters of twinkling sapphire and ruby, perhaps a river or the occasional pool of lava. Ashes and ashes; dust and dust. Perhaps a black obelisk throbs ominously in the desert. Who knows. Freud certainly doesn’t care. This terrestrial fantasy is already too concrete, dynamic and differentiated. For Freud, the beginning of life is really the beginning of time as such.

When life finally wriggles up from the dirt to inaugurate history it is barely distinguishable from its inorganic surroundings. Freud imagines a tiny cell, “a little fragment of living substance … suspended in the middle of an external world charged with the most powerful energies.” To protect itself from the violent onslaught of the world, this lonely scrap of life forms a protective shield. It wishes to die only its proper, “natural” death and will therefore go to great lengths to avoid perishing at the hands of hostile external forces. To survive attempts on its life, the fragile organism coats itself in a layer of death—“its outermost surface ceases to have the structure proper to living matter.” The organic dons the mask of the inorganic. As more complex life-forms evolve, this surface layer is internalized but the primal deathliness remains.

In 1929, Joan Riviere, a British psychoanalyst, described the process by which women who transgressed the confines of gender expectation in the workplace often responded by donning “a mask of womanliness to avert anxiety and retribution from men.” We might align this masked woman with Freud’s tiny cell—to protect itself from the violent onslaught of the world, this lonely scrap of life forms a protective shield—“its outermost surface ceases to have the structure proper to living matter.” The organic dons the mask of the inorganic.

Riviere’s discussion of femininity as masquerade understands gender as semblance. An inorganic costume is required to simulate the supposedly organic gender differences the war had torn to shreds. As the boundaries separating the masculine from the feminine are wearing away in the social realm, they must be more rigidly upheld through the performance of ideal norms. Freud suggests that the inorganic veneer is genderless. But for Riviere, donning a mask of femininity does not eradicate the notion of sexual difference, it consolidates it.

Horney and Riviere cling stubbornly to a world carved up into gendered halves—man/woman, feminine/masculine, male/female—the words are repeated and recombined insistently, but to what do they refer? They shuttle wildly between abstract and concrete. At times gender seems to inhere in bodies and at others only adhere to them. Something spills over, refuses to be contained.

By insisting that the binary between masculinity and femininity has a genital correlate, Horney and Riviere are resigned to assisting their patients to function within the prevailing norms of society. Healthy women must come to terms with their lack of a penis, which these psychoanalysts still insist defines them psychologically. But the “masculinized woman” is more explosive than they allow her [him, it, them] to be.

Horney and Riviere still treat gender difference as a point of origin. But Freud looked further back in time. He speculates that in the beginning everything was united. Here there were no gender distinctions—there were no distinctions at all. The first thing the Oxford English Dictionary tells us about the inorganic is that it is “not characterized by having organs or members fitted for special functions.”

The emergence of life represented a violent break with this original unity. “Splintered fragments of living substance” yearned to be whole again. This is where Freud situates the origins of the sexual instincts or Eros, which strive to draw together what the rupture from the inorganic tore apart. Freud considers that the opinion Plato ascribes to Aristophanes in the Symposium might have been correct: bodies were not originally gendered male or female.

But the real insight of Beyond the Pleasure Principle is that death and life are contemporaries. Like a bullet piercing the flesh of the present, inorganic nature has a revolutionary charge – not an uncontaminated then but a hybrid, technologized now. During the interwar years, mass-produced commodities marketed to a new kind of female producer-consumer proliferated—new perfumes with chemical bases that bore no resemblance to the fragrance of flowers, sleek rayon stockings, gaudy lipstick—synthetic masks of womanliness appropriate to an emerging synthetic reality. Life coated in a layer of death. Even at her most “feminine” she [he, it, they] is inorganic.

Read More

Link: Dead Can Dance

Why do we care about the dead? And what does this care keep alive in us? University of California, Berkeley, historian Thomas Laqueur is working through these questions, focusing on the history of European death cultures. Sprawling and ambitious in scope, his forthcoming book traces the different ways that the dead are put to work to help structure living societies.

The Work of the Dead tells the story of how Europe’s deceased traveled from the churchyard to the out-of-town cemetery via images of colonial power and national unity, and the new uses they were put to in the process. The project gathers a vast quantity of material on the praxis of death, from archaeological evidence of prehistoric burial rites to the modern practice of cremation. This wealth of detail evokes not only the specific individual necessity of mourning—of figuring out what we have lost when we lose someone important to us and how this importance can persist without the presence of its object—but the larger social task of creating histories, genealogies, and stories that organize our relations to one another. In Laqueur’s account, the social as such starts to look like a vast work of mourning. Or maybe it’s better to say that mourning looks like the starting point of the social. Animals know death too, but they don’t make such a habit of it.

His previous books include Making Sex: Body and Gender from the Greeks to Freud (1990) and Solitary Sex: A Cultural History of Masturbation (2003). Like these books, The Work of the Dead investigates the troubled line between nature and culture, and the manifold means by which broadly consistent physical facts become widely differing social realities.

In an interview, we talked about war memorials, Marx’s tomb, the names of the dead, and imperialist fantasies of murdering ghosts.

Can you give me a brief summary of the book you’re working on now?

The project I’m working on is called The Work of the Dead: Oblivion and Memory in Western Culture. I actually believe it to be broader than Western culture, but I want to be modest. The question I ask is, Why is it that we care for the dead body? We know that the dead body itself is just part of nature, that life has gone from it. But there’s a very long history going back to Paleolithic times of caring for the dead body, and people do it irrespective of what they believe about it. Socialists do it, Christians do it, Buddhists who believe that the body is irrelevant to the self do it. I’m interested in that puzzle.

Secondly, I’m interested in why we do it in practical ways: how dead bodies mark out borders and civilizations, how they seem to collect nations together. In other words, I want to argue that the living need the dead more than the dead need the living. The dead do all sorts of things for us.

I try to look at this in deep time, and I trace it through Christian traditions. I look at the 18th through the 20th centuries: how we find new places for the dead, how the cemetery replaces the churchyard, why, and how the cemetery is different from the churchyard. I explore why in the middle of the 19th century we start cremating people in these high-tech steel ovens that are borrowed from steelmaking technology, and what that means.

Lastly, I’m interested in the question of why, after so many centuries, millennia, of caring relatively little about the names of the dead, we’ve become so concerned about collecting them and marking landscapes with names, and putting names on memorials, and in general how we’ve come to think that every dead person has a name. Those are the big themes that I deal with.

In your book Making Sex, you insist on the idea that the body is culturally produced. You have a really nice phrase about how even though there is a body outside culture, we can access it only through culture. It seems that in relation to the dead, you can’t make that strong a claim. In some senses the biology of death—that bodies stop breathing, stop thinking, stop walking around—is absolute, and that’s an unassimilable fact that operates transculturally and transhistorically.

Sex operates like that too. Transhistorically, species that reproduce bisexually have two different sexes. So that’s the issue, how one understands and assimilates that fact into culture.

We know that Neanderthals buried, we know that some humans in the Paleolithic 25,000, 30,000 years ago buried. And as far as we have a record into the Upper Neolithic, very early settlements 7,000 to 8,000 years ago, people took care of the dead. So care of the dead is this moment in which the biological fact of death enters culture. Caring for the dead is like the incest taboo: It’s this moment, speaking theoretically, in which we move from nature into culture.

We care for the dead for all sorts of reasons, and each culture has made up many different reasons why it’s important. The dead are scary; the dead are scary for many reasons. The dead might be helped by the living and the living might be helped by the dead. The ultimate fact is that we care for the dead, and then we make up a bunch of reasons to justify that.

What strikes me as interesting is that we create communities with dead people that represent our communities, even though we know that what we’re burying is indistinguishable from anything else—the dead body of our friend is no different from the dead body of our enemy. But we believe it to be the dead body of a friend, and we invest meaning in that. We take nature, which is the dead body, and bring it into culture. That’s a very remarkable thing.

There are a bunch of tombs around Marx’s grave, and if an anthropologist were to dig it up a thousand years from now, they would think it looked exactly like a Sufi tomb, or a Catholic tomb, or a Hasidic tomb. Because somehow these dead materialist communists believe that their ashes produce a community with Karl Marx, even though their whole life philosophy suggests that’s rubbish. It’s not even his body in Highgate; it’s his ashes. But there’s a great Marx tomb that looks like there’s a body in it. There’s something equivalent in how we make gender out of sex. Everyone’s ashes are chemically indistinguishable, and yet ashes can make a community. If you were to say, okay, Marx’s ashes are there just to make a memorial, I would say that’s true, but it wouldn’t work without a body. If it were just a plaque with a name on it, it wouldn’t work. It’s similar to the Tomb of the Unknown Soldier; if you thought the body weren’t there, it wouldn’t work.

Even for people who explicitly believe that in death there’s nothing of the person left—for example, Marx was an Epicurean, and he thought that the atoms of your body returned to nature—the dead body matters. Mary Wollstonecraft’s husband writes very eloquently about what her dead body means to him. He says that, just as her eyeglasses and her books are dear to him, so, all the more, is her body, although he knows that she’s not in her body.

I was really interested in your description of cemeteries and how heavily they draw on a mishmash of ancient Egyptian architecture and other kinds. You also relate it to the colonial development of racial and national categories. Could you say more about how those things appear in these ornate cemeteries?

Before the Enlightenment, the only place you could get buried without ignominy was the churchyard. With the development of the cemetery, the dead make new kinds of communities.

In churchyards, the graves were all oriented east-west. In the English churchyard, there’s a particular botany: The yew tree is the tree of the churchyard. There’s only one kind of person in the churchyard: a Christian. There’s no private property in the churchyard; everything belongs to the community. Every monument has to have the approval of the parish priest, so you can’t build what you want there. The churchyard is a communal space, and it’s a space that belongs to the parish. Others can come in, but only by paying extra.

The cemetery is a place where (in theory) anyone can buy property, and you can lie next to anyone, and you can be buried in any direction you want to be buried. You can write anything you want on the tombstone. You can be any religion. If you’re Jewish, some rabbis might not be willing to bury you, but some rabbis would. It’s an open space, and people built according to the sensibilities of the day. Just like you can choose clothes that are slightly retro, the large cemeteries provided big tombs, graves that looked like Egyptian tombs, and Roman-style plinths, and you could present yourself in whatever way you wanted. The colonial cemetery in Calcutta looks like an imperial cemetery; it doesn’t look like a churchyard.

In Europe, every nation starts by producing national cemeteries. The first thing the Czechs do is produce a national cemetery. So the dead can produce all kinds of communities.

Race itself seems to have something to do with a long history of the dead.

In America there are racially segregated cemeteries, but the same is true of religious difference. In Northern Ireland, the Belfast cemetery has a Protestant and a Catholic section divided by a six-foot underground wall, so the purity of the Catholics and the purity of the Protestants won’t be violated. Communities of race and communities of servitude produce their own burial places.

The work of the dead is an immense feat of the human imagination. Not even the craziest 19th-century racist argues that the actual bones of a dead black person are any different from the bones of a white person, and yet they won’t be buried next to each other. The dead are so crucial to making communities because they become paths to connect to the past. There was a terrible uproar in Spain because the right wing exhumed the bodies of Republicans and put them in the Franco memorial, which is seen as a monument to fascism. Yet the bodies don’t know the difference—they’re the same as the fascist bodies. Lorca has a wonderful line: “Nowhere are the dead more alive than in Spain.” I don’t know if that’s especially true of Spain, but everywhere the dead are alive. We believe the dead to be alive whether we believe them to be ghosts or spirits or to inhabit an afterlife. It’s immaterial whether or not we believe the dead are somewhere else—we take dead bodies to represent the dead, to matter, and to be alive. A huge amount of life and culture is made manifest in the dead body.

We even believe the names to be alive. At the Vietnam Memorial people leave cigarettes and beer at the name of someone they loved. Yet if you actually asked them: do you believe in grave goods? Do you believe that there’s a ghost that will drink the beer? None of that. There’s no checklist of beliefs. But the name represents some version of the immortality or the presence of the person.

Is it a peculiarity of capitalist or white bourgeois society that it doesn’t specify any particular relation to the dead? Like you said about the people who leave beer at the Vietnam memorial, there’s no metaphysics that substantiates why they do it. That seems like a weird thing about modernity.

There aren’t many cultures that have a more engaged relation to the dead than the high capitalist or Victorian age. I think that now people are immensely engaged with the dead. There are endless battles about where someone can be buried. “Dzhokhar Tsarnaev can’t be in my cemetery.” Or people ingest ashes of their loved ones or get tattoos with ink made from the ashes of their loved ones. There are endless stories.

I think the bourgeoisie is, if anything, more actively engaged with the dead, though it’s hard to quantify. I certainly don’t believe that we’re not engaged with the dead. If you think of the dead of conflict, of the Holocaust, of war—they are crucial in our culture. In the high era of imperialist capitalism, the Unknown Warrior becomes a shrine in every European country. And there’s no theology that justifies it.

I would say in some sense that bourgeois culture is more engaged with the dead than before, because it’s a way to deal with the anxiety of “all that is solid melts into air.” It doesn’t melt into air—it stops with the dead. Many bourgeois conventions, rituals, and gestures exist to make it stop. I don’t believe the line that we ever stopped making the dead central in all sorts of cultures.

As you get more and more violent mass death, with colonialism and the wars of the 20th century, I was thinking about how unbearable it would be for the agents of this colonial culture to have the dead as a malevolent force. What if we did believe that the dead who died badly had some kind of presence? Of course symbolically the dead are very active, but we don’t really believe that the ghosts of people who died in concentration camps are haunting Angela Merkel. But we could. I was thinking about how the refusal of certain forms of death is also a convenient refusal of certain forms of guilt.

How long have ghosts ever haunted anyone? Ghosts haunt historically for relatively short amounts of time. People complained about the Holocaust memorial in Berlin exactly because you don’t want the ghosts of the Holocaust dead in the middle of the city. I think the ghosts of the Holocaust have survived for longer in Germany than ghosts of previous injustices have survived anywhere else.

The colonial issue is another one. The ghosts of those the British killed unjustly have not survived. That’s an interesting question, and I think it has to do with an idea of colonial power. You can kill the ghosts—that’s the imperialist fantasy: the Roman idea that you could actually kill everyone in Jerusalem or Carthage and that would be the end of them. They would not come back. That is the fantasy of colonial power, and it sometimes works and sometimes doesn’t.

Is there a relation between death and gender?

I wanted to say at first that my book is mostly about men. Because of patriarchy, civilization is carried through men, and there’s less care for the dead women. But I think the truth of the matter is that it varies enormously from culture to culture and situation to situation. Sometimes archaeologists find fewer women’s graves because there are fewer women, as in the frontier societies of the Vikings. Sometimes there are equal numbers, and women are cared for just as much as men.

I think that many public monuments and public presentations of the dead are about men because men die in war, and many of the great monuments to the dead are created in war. And “great men” have to be men, in patriarchal societies. But is there a dramatic difference in care of the dead by gender? Probably not. But that’s a hard thing to nail down. You can’t make the blanket statement that women’s bodies are less cared for in death, in general. The experts on the deep history of the dead aren’t clear on this. 

The masculine/feminine binary relates women to birth, death and this murky substrate, more associated with nature—and you talked about the dead body returning to nature. What do you think about that possible connection between women and death?

Mourners tend to be women, and you could argue that this is symbolically because women bring life into the world. In Jewish culture and in many other cultures, the dead body and menstrual blood are part of the same pollution system. In my work, I’ve tended not to deal with this so much because I don’t think it has a history and I don’t think there’s a way to study it empirically. What my work does is to say: look, there’s a universal history of the dead, but how we care for the dead in particular places is a consequence of particular social and cultural situations, not of theology, and not of ideology. The dead make communities; the dead work for the living. Sometimes it’s the bodies of dead women, sometimes of dead men, and sometimes it’s indifferent.

How does all this relate to either the personal experience of bereavement as an absolute loss, or our own awareness that we ourselves are going to die one day?

The whole notion is that we care for the dead body because the dead body of a friend or a loved one is significant, even though we know they are gone. In mourning there are different stages. The body, or some version of the body, or some sense that the body is somewhere, is crucial to mourning, both in the acute stage of loss and in the long stages of maintaining family connections and genealogy.

At some point, the dead will fade away—probably within decades. But yes, it’s about mourning. It’s about acute mourning or longer mourning. Whether you put the ashes in a river, whether you put the ashes in a burial place, or where you put the body, is crucial to mourning.

While the dead are gone, they’re not gone. While the dead don’t speak, they speak. St. Paul said that and we can say it now: The dead don’t speak and yet we hear them speak. We hear them speak in St. Paul, we hear them speak in the poetry of Thomas Hardy. They speak, and they chastise us and they say loving things to us, they say all sorts of things to us. And they say it from where the body is, usually. So everyone in mourning believes their dead aren’t gone, they’re somewhere, and they’re something. That’s why the whole thing works. A dog doesn’t have a sense that a dead dog is anywhere, but humans believe that the dead are somewhere.

But animals can mourn, dogs can mourn. People have seen elephants mourning.

It’s very brief. It’s not over the long term. Humans are the only creatures who produce culture around the dead.

Link: Death Stares

By Facebook’s 10th anniversary in February 2014, the site claimed well over a billion active users. Embedded among those active accounts, however, are the profiles of the dead: nearly anyone with a Facebook account catches glimpses of digital ghosts, as dead friends’ accounts flicker past in the News Feed. As users of social media age, it is inevitable that interacting with the dead will become part of our everyday mediated encounters. Some estimates claim that 30 million Facebook profiles belong to dead users, at times making it hard to distinguish between the living and the dead online. While some profiles have been “memorialized,” meaning that they are essentially frozen in time and only searchable to Facebook friends, other accounts continue on as before.

In an infamous Canadian case, a young woman’s obituary photograph later appeared in a dating website’s advertising on Facebook. Her parents were rightly horrified by this breach of privacy, particularly because her suicide was prompted by cyberbullying following a gang rape. But digital images, once we put them out into the world on social networking platforms (or just on iPhones, as recent findings about the NSA make clear), are open to circulation, reproduction, and alteration. Digital images’ meanings can change just as easily as Snapchat photographs appear and fade. This seems less objectionable when the images being shared are of yesterday’s craft cocktail, but having images of funerals and corpses escape our control seems unpalatable.

While images of death and destruction routinely bombard us on 24-hour cable news networks, images of death may make us uncomfortable when they emerge from the private sphere, or are generated for semi-public viewing on social networking websites. As I check my Twitter feed while writing this essay, a gruesome image of a 3-year-old Palestinian girl murdered by Israeli troops has well over a thousand retweets, indicating that squeamishness about death does not extend to international news events. By contrast, when a mother of four posted photographs of her body post cancer treatments, mastectomy scars fully visible, she purportedly lost over one hundred of her Facebook friends who were put off by this display. To place carefully chosen images and text on a Facebook memorial page is one thing, but to post photographs of a deceased friend in her coffin or on her deathbed is quite another. For social media users accustomed to seeing stylized profiles, images of decay cut through the illusion of curation.

In a 2009 letter to the British Medical Journal, a doctor commented on a man using a mobile phone to photograph a newly dead family member, pointing out with apparent distaste that Victorian postmortem portraits “were not candid shots of an unprepared still warm body.” He wonders, “Is the comparatively covert and instant nature of the mobile phone camera allowing people to respond to stress in a way that comforts them, but society may deem unacceptable and morbid?” While the horrified doctor saw a major discrepancy between Victorian postmortem photographs and the one his patient’s family member took, Victorian images were not always pristine. Signs of decay, illness, or struggle are visible in many of the photographs. Sickness or the act of dying, too, was depicted in these photos, not unlike the practices of deathbed tweeting and illness blogging. Even famous writers and artists were photographed on their deathbeds.

Photography has always been connected to death, both in theory and practice. For Roland Barthes, the photograph is That-has-been. To take a photo of oneself, to pose and press a button, is to declare one’s thereness while simultaneously hinting at one’s eventual death. The photograph is always “literally an emanation of the referent” and a process of mortification, of turning a subject into an object — a dead thing. Susan Sontag claimed that all photographs are memento mori, while Eduardo Cadava said that all photographs are farewells.

The perceived creepiness of postmortem photography has to do with the uncanniness of ambiguity: Is the photographed subject alive or dead? Painted eyes and artificially rosy cheeks, lifelike positions, and other additions made postmortem subjects seem more asleep than dead. Because of its ability to materialize and capture, photography both mortifies and reanimates its subjects. Not just photography, but other container technologies like phonographs and inscription tools can induce the same effects. Digital technology is another incarnation of these processes, as social networking profiles, email accounts, and blogs become new means of concretizing and preserving affective bonds. Online profiles and digital photographs share with postmortem photographs this uncanny quality of blurring the boundaries between life and death, animate and inanimate, or permanence and ephemerality.

Sharing postmortem photos or mourning selfies on social media platforms may seem creepy, but death photos were not always politely confined to such depersonalized sources as mass media. Postmortem and mourning photography were once accepted or even expected forms of bereavement, not callously dismissed as TMI. Victorians circulated images of dead loved ones on cabinet cards or cartes de visite, even if they could not reach as wide a public audience as those who now post on Instagram and Twitter. Photography historian Geoffrey Batchen notes that, “displayed in parlors or living rooms or as part of everyday attire, these objects occupied a liminal space between public and private. They were, in other words, meant to do their work over and over again, and to be seen by both intimates and strangers.”

Victorian postmortem photography captured dead bodies in a variety of positions, including sleeping, sitting in a chair, lying in a coffin, or even standing with loved ones. Thousands of postmortem and mourning images from the 19th and early 20th centuries persist in archives and private collections, some of them bearing a striking resemblance to present-day images. The Thanatos Archive in Woodinville, Washington, contains thousands of mourning and postmortem images from the 19th century. In one Civil War-era mourning photograph, a beautiful young woman in white looks at the camera, not dissimilar to the images of the coiffed young women on Selfies at Funerals. In another image, a young woman in black holds a handkerchief to her face, an almost exaggerated gesture of mourning that the comically excessive pouting found in many funeral selfies recalls. In an earlier daguerreotype, a young woman in black holds two portraits of presumably deceased men.

Batchen describes Victorian mourners as people who “wanted to be remembered as remembering.” Many posed while holding photographs of dead loved ones or standing next to their coffins. Photographs from the 19th century feature women dressed in ornate mourning clothes, staring solemnly at photographs of dead loved ones. The photograph and braided ornaments made from hair of the deceased acted as metonymic devices, connecting the mourner in a physical way to the absent loved one, while ornate mourning wear, ritual, and the addition of paint or collage elements to mourning photographs left material traces of loss and remembrance.

Because photographs were time-consuming and expensive to produce in the Victorian era, middle-class families reserved portraits for special events. With the high rate of childhood mortality, families often had only one chance to photograph their children: as memento mori. Childhood mortality rates in the United States, while still higher than in many other industrialized nations, are now significantly lower, meaning that images of dead children are startling. For those who do lose children today, however, the service Now I Lay Me Down to Sleep produces postmortem and deathbed photographs of terminally ill children.

Memorial photography is no mere morbid remnant of a Victorian past. Through his ethnographic fieldwork in rural Pennsylvania, anthropologist Jay Ruby uncovered a surprising number of postmortem photography practices in the contemporary U.S. Because of the stigma associated with postmortem photography, however, most of his informants expressed their desire to keep such photographs private or even secret. Even if these practices continue, they have moved underground. Unlike the arduous photographic process of the 19th century, which could require living subjects to sit disciplined by metal rods to keep them from blurring in the finished image, smartphones and digital photography allow images to be taken quickly or even surreptitiously. Rather than calling on a professional photographer’s cumbersome equipment, grieving family members can use their own devices to secure the shadows of dead loved ones. While wearing jewelry made of human hair is less acceptable now (though people do make their loved ones into cremation diamonds), we may instead use digital avenues to leave material traces of mourning.

Why did these practices disappear from public view? In the 19th century, mourning and death were part of everyday life but by the middle of the 20th century, outward signs of grief were considered pathological and most middle-class Americans shied away from earlier practices, as numerous funeral industry experts and theorists have argued. Once families washed and prepared their loved ones’ bodies for burial; now care of the dead has been outsourced to corporatized funeral homes.

This is partly a result of attempts to deal with the catastrophic losses of the First and Second World Wars, when proper bereavement included separating oneself from the dead. Influenced by Freudian psychoanalysis’s categorization of grief as pathological, psychologists from the 1920s through the 1950s associated prolonged grief with mental instability, advising mourners to “get over” loss. Also, with the advent of antibiotics and vaccines for once common childhood killers like polio, the visibility of death in everyday life lessened. The changing economy and beginnings of post-Fordism contributed to these changes as well, as care work and other forms of affective labor moved from the domicile to commercial enterprises. Jessica Mitford’s influential 1963 book, The American Way of Death, traces the movement of death care from homes to local funeral parlors to national franchises, showing how funeral directors take advantage of grieving families by selling exorbitant coffins and other death accoutrements. Secularization is also a contributing factor, as elaborate death rituals faded from public life. While death and grief reentered the public discourse in the 1960s and 1970s, the medicalization of death and growth of nursing homes and hospice centers meant that many individuals only saw dead people as prepared and embalmed corpses at wakes and open casket funerals.

Despite this, reports of a general “death taboo” have been greatly exaggerated. Memorial traces are actually everywhere, prompting American Studies scholar Erika Doss to dub this the age of “memorial mania.” Various national traumas have led to numerous memorials, both physical and online, including tactile examples like the AIDS memorial quilt, large physical structures like the 9/11 memorial, long-standing online entities like sites remembering Columbine, and more recent localized memorials dedicated to the dead on social networking websites.

But these types of memorials did not immediately normalize washing, burying, or photographing the body of a loved one. There’s a disconnect between the shiny and seemingly disembodied memorials on social media platforms and the presence of the corpse, particularly one that has not been embalmed or prepared.

Some recent movements in the mortuary world call for acknowledgement of the body’s decay rather than relying on disembodied forms of memorialization and remembrance. Rather than outsourcing embalmment to a funeral home, proponents of green funerals from such organizations as the Order of the Good Death and the Death Salon call for direct engagement with the dead body, learning to care for and even bury dead loved ones at home. The Order of the Good Death advises individuals to embrace death: “The Order is about making death a part of your life. That means committing to staring down your death fears — whether it be your own death, the death of those you love, the pain of dying, the afterlife (or lack thereof), grief, corpses, bodily decomposition, or all of the above. Accepting that death itself is natural, but the death anxiety and terror of modern culture are not.”

The practices having to do with “digital media” and death that some find unsettling — including placing QR codes on headstones, using social media websites as mourning platforms, snapping photos of dead relatives on smartphones, funeral selfies, and illness blogging or deathbed tweeting— may be seen as attempts to do just that, materializing death and mourning much like Victorian postmortem photography or mourning hair jewelry. Much has been made of the loss of indexicality with digital images, which replace this physical process of emanation with flattened information, but this development doesn’t obviate the relationship between photography and death. For those experiencing loss, the ability to materialize their mourning — even in digital forms — is comforting rather than macabre.

Link: The Way of All Flesh: On Tolstoy and Mortality

You probably won’t be around for your death, and it’s probably all right that you miss it. In Middlemarch, Edward Casaubon’s death is another of life’s myriad experiences, albeit a “commonplace” one that becomes both an abomination and an act of imagination—one’s mind plays tricks, including spiritual tricks, as the mind and body die. In Thomas Mann’s The Magic Mountain, Hofrat Behrens tries to comfort the mother of the noble young officer Joachim Ziemssen, who lies dying in a sanatorium: “We come out of the dark and go into the dark again, and in between lie the experiences of our life. But the beginning and the end, birth and death, we do not experience; they have no subjective character.” Samuel Johnson told James Boswell, in typical Johnsonian fashion, that it simply doesn’t matter how a man dies, only how he lives, because dying doesn’t last that long. Unless, of course, it does, and for Leo Tolstoy’s character Ivan Ilyich, dying is a protracted process that assumes just as much importance as living—a process that indeed takes meaning away from or gives it to the life lived.

Who but Vladimir Nabokov, in his peerless Lectures on Russian Literature, could have noticed that “Ilyich” is pronounced ill-itch—“the ills and itches of mortal life”? Nabokov was clear in pointing out that Tolstoy’s famous novella is not about Ivan’s death, but about his life (despite the fact that less than a quarter of the novella is devoted to Ivan’s life). Nabokov dubs the story Tolstoy’s “most artistic, most perfect, and most sophisticated achievement,” and that, ladies and gentlemen, is saying quite a lot. The esteemed Tolstoy biographer Henri Troyat called The Death of Ivan Ilyich a “double story of the decomposing body and awakening soul.” This double quality, this wedding of antithetical forces, is part of what contributes to the immortal force of Ivan Ilyich. The binary of the soul’s ascension and the body’s decline works only if, as Nabokov asserts, the story becomes about proper living instead of inevitable dying. Johnson meant that dying wasn’t important because there was nothing that could be learned from it, nowhere to go after it: As an experience it’s worthless, which is exactly what Ludwig Wittgenstein suggested when he wrote in his Tractatus, “Death is not an event of life. Death is not lived through.”

Peter Carson’s new translation of The Death of Ivan Ilyich—for the first time paired with Tolstoy’s devastating spiritual memoir Confession—has its own double story: As Ilyich was dying on the page, Carson was dying at his desk, besieged by the late-stage cancer that would kill him. A revered English publisher, editor-in-chief of Penguin and then Profile Books, Carson was also the translator of Ivan Turgenev’s imperishable novel Fathers and Sons. Classicist Mary Beard, in her touching introduction to this volume, writes that Carson was “one of the finest translators there has ever been of nineteenth century Russian literature.” After his cancer death sentence in 2012, he left Profile and toiled full time on Tolstoy’s two classics, and it’s impossible not to imagine that this urgent task served as Carson’s own spiritual bulwark against the despair of his fate. How determined he must have been to complete this task—his final life’s work—even as he felt himself corroding daily from the disease. Carson isn’t the only scholar who chose to spend his last mile working on the complexities of Count Tolstoy: The historian William Shirer—author of The Rise and Fall of the Third Reich—died in 1993 just after he completed Love and Hatred: The Stormy Marriage of Leo and Sonya Tolstoy.

Peter Carson has composed translations so nuanced and potent they are sure to be the benchmark for decades to come. His ingenious decision to pair these important narratives allows us the privilege of apprehending them as Tolstoy intended, because even for the most secular among us, dying can never be entirely devoid of spiritual yearning. Carson died on January 9, 2013, at the age of seventy-four, having finished both translations just two days earlier. As Beard tells us, the last lines of Ivan Ilyich, translated by Carson himself, were read aloud at his funeral:

“It is finished!” someone said above him.

He heard these words and repeated them in his heart. “Death is finished,” he said to himself. “It is no more.” 

He breathed in, stopped halfway, stretched himself, and died.

Death is finished. It’s an extraordinary statement, wholly different from saying I am finished, and one akin to John Donne’s unforgettable image in his Devotions: “When one man dies, one chapter is not torn out of a book, but translated into a better language; and every chapter must be so translated.” If Donne had been available in Russian, Tolstoy would have admired his feat of imagining—​lives translated into better lives—​and would have found much of his own artistic/spiritual logic in Donne’s most famous sonnet, “Death Be Not Proud.”

Almost all the English translations of Ivan Ilyich prettify Tolstoy’s gnarled syntax and staccato cadences in an attempt to make him a smoother, more lyrical storyteller. Carson remains exceedingly loyal to the Russian original, to that element in Tolstoy which Nabokov dubbed “the groping purist”: Tolstoy “unwraps the verbal parcel for its inner sense, he peels the apple of the phrase, he tries to say it one way, then a better way, he gropes, he stalls, he toys, he Tolstoys with words.” This Tolstoying with words can make for some syntactical tangles and repetitions, stop-and-go paragraphs wanting in fluidity. The simplicity of Tolstoy’s plot—an ordinary judge falls from a ladder, bumps his side, becomes ill, and for months lies on a sofa dying in agony—and the almost childlike simplicity of Tolstoy’s style—“Ivan Ilyich’s past life had been very simple and ordinary and very awful”—are, as in Ernest Hemingway, deceptive simplicities. Nabokov reminds us that “no major writer is simple. … Mom is simple. Digests are simple. Damnation is simple. But Tolstoys and Melvilles are not simple.”

Some context is in order. After he completed Anna Karenina in 1877, Tolstoy experienced what we might call a nervous breakdown—​a religious crisis which led him to abandon fiction and become a soapboxer, an aspiring saint, a preacher of austerity and fulminator against Orthodoxy. The conversion occurred at the steep expense of friends and his family’s harmony. Turgenev, for one, was by turns befuddled and appalled by Tolstoy’s conversion, while Sonya Tolstoy and their children refused to follow in his ascetic new beliefs, though many around the world would do just that, and give those beliefs a name, too: Tolstoyism—​an iffy doctrine hostile to both Church and State, an Earthly enactment of God’s plan to be realized through peaceful rebellion. Tolstoy could hardly have claimed surprise or displeasure when his soapboxing erupted into a worldwide doctrine, since as a young man in his midtwenties he noted in his diary that he wanted to found a fresh religion, a “religion of Christ but purged of dogmas and mysticism.” It’s true that Tolstoy’s religious writings lack the mystical blather that so titillated a thinker such as Dietrich Bonhoeffer, but Tolstoy never acknowledged that any spiritual program, no matter how divorced from institution, is organically susceptible to dogma. 

Written in 1886, The Death of Ivan Ilyich was the first fiction Tolstoy published after the spiritual upheaval he chronicles in Confession. It’s easy to imagine Ilyich as the old and bearded sage-​looking man Tolstoy was upon his death, but he’s only forty-​five years old, and this fact adds to the tremendous pathos of the story: The death of a young man is always more awful than the death of an old man. The priest gives Ilyich little spiritual consolation, and the doctors are self-​important fools, incapable of mitigating his pain. His co-​workers are disgusted by the thought of his wasting body and care only about jockeying for cozier positions once he dies. His wife and children, occupied by the minutiae of their quotidian lives, refuse to admit what has befallen him. He finds their refusal to confront this fanged truth most disgusting of all: “Ivan Ilyich’s chief torment was the lie—​that lie, for some reason recognized by everyone, that he was only ill but not dying.” His sole comfort comes from Gerasim, the peasant servant who does not recoil from the foul stench, who accepts the inevitability of all flesh. If Ilyich’s upper-​crust friends regard death as indecent, Gerasim knows otherwise: His peasant’s dirty-​hands understanding of life, his calm acceptance of every person’s fate, helps to calm Ilyich into his own acceptance. (The peasantry’s calm acceptance of death, by the way, can be noticed in Turgenev, Dostoevsky, and Solzhenitsyn, to name a few—​it seems to fall somewhere in line between Russian literary trope and Russian cultural myth.) Relief for Ilyich comes only after he has followed Gerasim’s lead and acquiesced to his fate.

Much has been written about exactly what disease or injury afflicts Ilyich—Troyat is certain it’s stomach cancer—but the narrative makes clear that the name or physical nature of the affliction matters not at all. Some have read Ilyich’s ordeal as a manifestation of a profoundly sick society wed to Mammon, an indication that an entire culture has been corroded by hedonism and greed. Nadine Gordimer has written that Ilyich “was fatally sickened by his times.” Philip Rahv, in an inspired piece on both Ivan Ilyich and Franz Kafka’s novel The Trial, wrote: “As to the mysterious catastrophe which destroys Ilyich, what is it in historical reality if not the ghost of the old idealism of status returning to avenge itself on its murderer? Through Ilyich’s death the expropriators are expropriated.” What we behold in Joseph K. and Ilyich, says Rahv, is “the historic depletion of man.” It’s important to remember the essence of Tolstoy’s ideology when he was composing Ivan Ilyich: The uncomplicated of-the-land peasantry was the paragon of human living, while the city-poisoned bourgeoisie was submerged in the spiritual quicksand of its own rampant materialism.

Rahv has written that this novella “would be utterly pointless if it were to see Ivan Ilyich as a special type and what happened to him as anything out of the ordinary. Ivan Ilyich is Everyman.” The literary scholar Victor Brombert agrees; in his lovely new book Musings on Mortality: From Tolstoy to Primo Levi, he writes of Ivan Ilyich: “It is hard to imagine a more unremarkable first name and patronymic. It is like calling the protagonist John Smith or Everyman. And nothing could be more common or widespread than death.” But it’s mistaken to think that every person experiences death precisely as Ilyich does, especially when you heed Nabokov’s injunction to see the story as being about his spiritually vacant and frivolous life. Despite the universality of his predicament, Ilyich is no Everyman, because not everyone spends his final months remorseful over a misguided life. Ivan Ilyich is, rather, more like Count Tolstoy himself: probing, railing, regretful, conflicted, intransigent to the last.

Because Ilyich “sees the light,” as the cliché has it, because he comes to comprehend that his existence has been in error, the story amounts to a confirmation of the Christian paradox that one must die in order to live, that one’s true life—true because eternal—begins at death. Scholars have noted, too, that the ending of Ivan Ilyich smacks of Christ’s crucifixion: Ilyich’s final agonizing stretch of three days, his exacerbated inquiry, “Why, why do you torment me so horribly?” an unambiguous echo of Christ’s famous “Why hast Thou forsaken me?” Brombert skillfully shows how “the transition from chapter 6 to chapter 9 closely parallels the transition from the sixth to the ninth hour of the Crucifixion.” All this Christian special pleading makes for a convenient ending, both too hasty and too tidy. Worse, it smells suspiciously of propaganda—the narrative tortures a man only so that he can receive the deliverance which was, we can’t help but see, a foregone conclusion. Worse still, it’s an obese bromide: One must travel through hell to reach heaven? This is what happens when the fiction writer allows himself to be breathed on by the pamphleteer.

Some scholars view Tolstoy’s spiritual crisis as a rupture in his creativity, his conversion as destruction, but look closely at Tolstoy’s fiction prior to 1878 and you’ll see that the rupture was no such thing, that the quartet of spiritual books he produced from 1878 to 1882—Confession, Critique of Dogmatic Theology, Harmony and Translation of the Four Gospels, and What I Believe—was penned by the same creative hand which penned War and Peace and Anna Karenina. The question “How can one live without despair?” crops up everywhere in the work, and one suspects that without his tremendous worldly success Tolstoy would have easily morphed into Kafka, overwhelmed by every breath, stomped under the shoe of existence. What’s more, his intense fear and contemplation of death and dying was not unique to the post-conversion period. Five of his thirteen children died before their tenth birthdays—never underestimate how such calamity can warp even the most stoic of men. Tolstoy was crushed by his brother Nikolai’s death from consumption in 1860 (he also visited Anton Chekhov as the younger writer lay dying of the same disease). As early as 1869 he experienced what Maxim Gorky named the “Arzamasian Terror”: During the night in a hotel in Arzamas, Tolstoy woke suddenly with a cutting dread of death and the certain knowledge that living was futile (he’d been eyebrow-deep in the philosophy of Arthur Schopenhauer, which explains much). One of the central meanings of War and Peace is how human beings have as much control of history as they have of their own mortality. And look at the dying of Nikolai Levin in Anna Karenina to see how similar it is to the dying of Ivan Ilyich:

His sufferings, growing more and more severe, did their work and prepared him for death. … Hitherto each individual desire aroused by suffering or privation, such as hunger, fatigue, thirst, had brought enjoyment when gratified. But now privation and suffering were not followed by relief, and the effort to obtain relief only occasioned fresh suffering. And so all desires were merged in one—the desire to be rid of all this pain and of its source, the body. But he had no words to express this desire for deliverance, and so he did not speak of it.

Here’s what Meursault contemplates in Albert Camus’s Stranger while he’s waiting in prison to be executed for murder, a contemplation at the very crux of the spiritual disaster in Tolstoy’s Confession:

Deep down I knew perfectly well that it doesn’t much matter whether you die at thirty or at seventy, since in either case other men and women will naturally go on living. … Whether it was now or twenty years from now, I would still be the one dying. At that point what would disturb my train of thought was the terrifying leap I would feel my heart take at the idea of having twenty more years of life ahead of me. But I simply had to stifle it by imagining what I’d be thinking in twenty years when it would all come down to the same thing anyway. Since we’re all going to die, it’s obvious that when and how don’t matter.

Never the low aimer, Tolstoy called his memoir Confession with both Saint Augustine and Jean-Jacques Rousseau in mind. From grievous detail to grievous detail—“life is nonsense … nothing ahead but doom … complete annihilation”—the book recounts Tolstoy’s hard path between spurious “Church” belief and “true” Christian belief, one denuded of officiating and ostentation. Orthodoxy is “stupid, cruel, and immoral”—replace “Orthodoxy” with “Catholicism” and Tolstoy has more in common with Martin Luther than he would have dared admit. In his masterwork A History of Russian Literature, D. S. Mirsky calls Confession “the greatest piece of oratory in Russian literature,” and while that might be a bit of hyperbole, Confession does boast an oratorical acuity all the more remarkable because it pretends to do no such thing. The question at its core is this: “Is there any meaning in my life that wouldn’t be destroyed by the death that inevitably awaits me?” It’s the very question—the very horror—that pesters Ivan Ilyich during his months-long agon against death. And Camus must have had these lines in mind when he was composing Meursault’s demise: “You can only live as long as you’re drunk with life; but when you sober up, you can’t help but see that all this is just a fraud, and a stupid fraud. Precisely that: there’s nothing even amusing or witty about it; it’s simply cruel and stupid.”

And so the great man searched. Schopenhauer, Solomon, and Buddha offered no solace. Scientific rationalism was a coffin for his soul. Others of his own class and education had no clue. Then, in a suicidal stupor, he began to see that the supernaturalism and irrationality of faith, and all the vulgate attached to it, weren’t so stupid after all: “It alone gives mankind answers to the questions of life and consequently the possibility of living.” Writing War and Peace and Anna Karenina wasn’t enough; the love of his wife wasn’t enough; the lives of his children weren’t enough; Leo Tolstoy also had to have an invitation from the infinite. And those who mailed him this invitation were the peasants—because, like Gerasim in Ivan Ilyich and unlike all the poseurs from Tolstoy’s own set, the peasants didn’t pretend. Their beliefs weren’t disconnected from their lives; their superstitions were meaningful because they enhanced happiness. Furthermore, their privation and ceaseless hardship were not sources of wonder or remorse—they accepted existence as it was. And by accepting existence as it was they accepted its cessation too. Tolstoy’s rabid dread of death turned him into something of a slummer: This genius with deep wealth and unmatched renown tried unsuccessfully to embrace privation and even took to wearing the peasant’s traditional dress. But it’s one thing to wear their clothes; quite another to live their lives.

In the powerful conclusion of his long essay on Tolstoy, The Hedgehog and the Fox, Isaiah Berlin describes the agony of Tolstoy’s dying as an inability to resolve “the conflict of what there is with what there ought to be.” For Tolstoy, what ought to have been was his physical immortality—his death struck him as an unconscionable affront to the cosmic order. How could an intelligence and imagination that vast ever die? He succumbed to pneumonia at the age of eighty-two at a railway station in Astapovo, a far-off Russian village. He’d fled his dismaying and dismayed wife and their estate, called Yasnaya Polyana, ten days earlier. “I am doing what old men of my age usually do,” he wrote in his farewell letter to Sonya; “leaving worldly life to spend the last days of my life in solitude and quiet.” He died in the care of his daughter Sasha, his family doctor, and a probably awe-smacked peasant stationmaster—his Gerasim—while Sonya was forbidden to see her husband of forty-eight years. (There’s a famous, heart-stabbing photo of her peering into a window of the modest home where her husband lay dying. There’s also a novel by Jay Parini, The Last Station, which beautifully imagines the torture of the Tolstoys’ final year.) So eminent was Tolstoy that many hundreds, including the Tsar’s operatives and a battalion of reporters, descended on Astapovo and created an international commotion. It’s highly unlikely that Leo Tolstoy, even in his weakened and addled state, wasn’t aware of how his life had just transformed into his fiction, of how by some creative miracle he had augured this very demise twenty-four years earlier in The Death of Ivan Ilyich.

Link: Death and Madness at Diamond Mountain: Buddhism in the West

People come from all over the world to Arizona’s Diamond Mountain University, hoping to master Tibetan teachings and achieve peace of mind. For some, the search for enlightenment can go terribly wrong.

Ian Thorson was dying of dehydration on an Arizona mountaintop, and his wife, Christie McNally, didn’t think he was going to make it. At six in the morning she pressed the red SOS button on an emergency satellite beacon. Five hours later a search-and-rescue helicopter thumped its way to the stranded couple. Paramedics with medical supplies rappelled off the hovering aircraft, but Thorson was already dead when they arrived. McNally required hospitalization. The two had endured the elements inside a tiny, hollowed-out cave for nearly two months. To keep the howling winds and freak snowstorms at bay, they had dismantled a tent and covered the cave entrance with the loose cloth. Fifty yards below, in a cleft in the rock face, they had stashed a few plastic tubs filled with supplies. Even though they considered themselves Buddhists in the Tibetan tradition, an oversize book on the Hindu goddess Kali lay on the cave floor. When they moved there, McNally and Thorson saw the cave as a spiritual refuge in the tradition of the great Himalayan masters. Their plan was as elegant as it was treacherous: They would occupy the cave until they achieved enlightenment. They didn’t expect they might die trying.

Almost irrespective of the actual spiritual practices on the Himalayan plateau, the West’s fascination with all things Tibetan has spawned movies, spiritual studios, charity rock concerts and best-selling books that range from dense philosophical texts to self-help guides and methods to Buddha-fy your business. It seems as if almost everyone has tried a spiritual practice that originated in Asia, either through a yoga class, quiet meditation or just repeating the syllable om to calm down. For many, the East is an antidote to Western anomie, a holistic counterpoint to our chaotic lives. We don stretchy pants, roll out yoga mats and hit the meditation cushion on the same day that we argue about our cell phone bill with someone in an Indian call center. Still, we look to Asian wisdom to center ourselves, to decompress and to block off time to think about life’s bigger questions. We trust that the teachings are authentic and hold the key to some hidden truth. We forget that the techniques we practice today in superheated yoga studios and air-conditioned halls originated in foreign lands and feudal times that would be unrecognizable to our modern eyes: eras when princely states went to war over small points of honor, priests dictated social policy and sending a seven-year-old to live out his life in a monastery was considered perfectly ordinary.

Yoga, meditation, chakra breathing and chanting are powerful physical and mental exercises that can have profound effects on health and well-being. On their own they are neither good nor bad, but like powerful lifesaving drugs, they also have the potential to cause great harm. As the scholar Paul Hackett of Columbia University once told me, “People are mixing and matching religious systems like Legos. And the next thing you know, they have some fairly powerful psychological and physical practices contributing to whatever idiosyncratic attitude they’ve come to. It is no surprise people go insane.” No idea out of Asia has as much power to capture our attention as enlightenment. It is a goal we strive toward, a sort of perfection of the soul, mind and body in which every action is precise and meaningful. For Tibetans seeking enlightenment, the focus is on the process. Americans, for whatever reason, search for inner peace as though they’re competing in a sporting event. Thorson and McNally pursued it with the sort of gusto that could break a sprinter’s leg. And they weren’t alone. More than just the tragedy of obscure meditators who went off the rails in nowhere Arizona, Thorson’s death holds lessons for anyone seeking spiritual solace in an unfamiliar faith.

Until February 2012, McNally and Thorson were rising stars among a small community of Tibetan Buddhist meditators and yoga practitioners who had come to the desert to escape the scrutiny and chaos of the city in order to focus on spiritual development. McNally was a founding member of Diamond Mountain University and Retreat Center – a small campus of yurts, campers, temples and retreat cabins that sprawls over two rocky valleys adjacent to historic Fort Bowie in Arizona. In the past decade Diamond Mountain has risen from obscurity to become one of the best-known, if most controversial, centers for Tibetan Buddhism in the United States. Its supreme spiritual leader is Michael Roach, an Arizona native, Princeton graduate and former diamond merchant who took up monk’s robes in the 1980s and remains one of this country’s most enthusiastic evangelists for Tibetan Buddhism. McNally was Roach’s most devoted student, his lover, his spiritual consort and, eventually, someone he recognized as a living goddess. For 14 months McNally led one of the most ambitious meditation retreats in the Western world. Starting in December 2010 she and 38 other retreat participants pledged to cut off all direct contact with the rest of the planet and meditate under vows of silence for three years, three months and three days. Unwilling to speak, they wrote down all their communications. Phone lines, air-conditioning and the Internet were off-limits.

The only way they could communicate with their families was through postal drops once every two weeks. The strict measures were intended to remove the distractions that infiltrate everyday life and allow the retreatants a measure of quiet to focus on the structure of their minds. Thorson’s death might have gone unnoticed by the world if, days afterward, Matthew Remski, a yoga instructor, Internet activist and former member of the group, had not begun to raise questions about the retreat’s safety on the well-known Buddhist blog Elephant Journal. He called for Roach to step down from Diamond Mountain’s board of directors and for state psychologists to evaluate the remaining 30-odd retreatants. His posting received a deluge of responses from current and former members, some of whom alleged sexual misconduct by Roach and made accusations of black magic and mind control.

Roach rose to prominence in the late 1990s after the great but financially impoverished Tibetan monastery Sera Mey conferred on him a geshe degree, the highest academic qualification in Tibetan Buddhism. Conversant in Russian, Sanskrit and Tibetan, he was an ideal messenger to bring Buddhism to the West and was widely acclaimed for his ability to translate complex philosophical ideas into plain English. He was the first American to receive the title, which ordinarily takes some 20 years of intensive study. In his case, he was urged by his teacher, the acclaimed monk Khen Rinpoche, to spend time outside the monastery, in the business world. At his teacher’s command, Roach took a job at Andin International Diamond Corporation, buying and selling precious stones. According to a book Roach co-authored with McNally, The Diamond Cutter: The Buddha on Managing Your Business and Your Life, in 15 years he grew the firm from a small-time company to a giant global operation that generated annual revenue in excess of $100 million. The book cites a teaching called “The Diamond Sutra,” in which the Buddha looks at diamonds, with their clarity and strength, as symbolic of the perfection of wisdom. But the diamond industry, particularly during the years Roach was active in it, is one of the dirtiest in the world – fueling wars in Africa and linked to millions of deaths.

During a lecture Roach gave in Phoenix last June, I asked him how he could reconcile his Buddhist ethics with making vast sums of money through violent supply chains. Roach stared at me with moist, sincere-looking eyes and avoided the question. “If your motivation is pure, then you can clean the environment you enter,” he said. “I wanted to work with diamonds. It was a 15-year metaphor, not a desire to make money. I wanted to do good in the world, so I worked in one of the hardest and most unethical environments.” It was the sort of answer that plays well with business clients. Rationalizations like this are not uncommon in industry, but they are for a Buddhist monk. If Roach was unorthodox, he was also indispensable. His business acumen might have been enough for some early critics to look the other way. His share of Andin’s profits was ample enough that he could funnel funds to Sera Mey to establish numerous charitable missions.

His blend of Buddhism and business made him an instant success on the lecture circuit, and even today he is comfortable in boardrooms in Taipei, Geneva, Hamburg and Kiev, lecturing executives on how behaving ethically in business will both make you rich and speed you along the path of enlightenment.

Ian Thorson had always been attracted to alternative spirituality, and he had a magnetic personality that made it easy for him to win friends. Still, “he was seeking something, and there was an element of that asceticism that existed long before he took to any formal practice of meditation, yoga and whatnot,” explains Mike Oristian, a friend of his from Stanford University. Oristian recounts in an email the story of a trip Thorson took to Indonesia, where he hoped a sacred cow might lick his eyes and cure his poor eyesight. It didn’t work, and Thorson later admitted to Oristian that “it was a long way to go only to have the feeling of sandpaper on his eyes.”

Roach gave Thorson’s passion a structure and a systematic way to think about his spiritual quest. After Thorson began studying Roach’s teachings in 1997, Oristian remembers, some of his spontaneous spark seemed to fade. Kay Thorson, Ian’s mother, had a different perspective. She suspected he had fallen under the sway of a cult and hired two anti-cult counselors to stage an intervention. In June 2000 they lured him to a house on Long Island and tried to get him to leave the group. “He was skinny, almost anorexic,” she says. They tried to show him he had options other than following Roach. For a time it seemed to work. Afterward he wrote to a friend about his family’s attempt to deprogram him: “It’s so weird that my mom thinks I’m in a cult and so does Dad and so does my sister. They talk to me in soft voices, like a mental patient, and tell me that the people aren’t ill-intentioned, just misguided.” For almost five years he traveled through Europe, working as a translator and tutor, but he never completely severed ties. Eventually he made his way back to Roach’s fold.

In 1996, when she was only two years out of New York University, Christie McNally dropped any plans she’d had to pursue an independent career and became Roach’s personal attendant, spending every day with him and organizing his increasingly busy travel schedule. And though his growing base of followers didn’t know it, she would soon be sharing Roach’s bed.

The couple married in a secret ceremony in Little Compton, Rhode Island, in 1998. Like many charismatic teachers before him, Roach established a dedicated following. As it grew he planned an audacious feat that would take him out of the public eye and at the same time establish him in a lineage of high Himalayan masters. He announced that, from 2000 to 2003, he would put his lecturing career on hold and attempt enlightenment by going on a three-year meditation retreat along with five chosen students, among them Christie McNally. In many ways, Roach’s silence was more powerful than his words. Three years, three months and three days went by, and Roach’s reputation grew. Word of mouth about his feat helped expand the patronage of Diamond Mountain and the Asian Classics Institute, which distributed his teachings through audio recordings and online courses.

Every six months he emerged to teach breathless crowds about his meditating experiences. At those events he was blindfolded but spoke eloquently on the nature of emptiness. Finally, on 16 January 2003 he dropped two bombshells in a poem he addressed to the Dalai Lama and published in an open letter. In his first revelation he claimed that after intensive study of tantric practices he had seen emptiness directly and was on the path to becoming a bodhisattva, a sort of Tibetan angel. The word tantra derives from Sanskrit and indicates secret ritualized teachings that can be a shortcut to advanced spiritual powers. The second revelation was that while in seclusion he had discovered that his student Christie McNally was an incarnation of Vajrayogini, the Tibetan diamond-like deity, and that he had taken her as his spiritual consort and wife. They had taken vows never to be more than 15 feet from each other for the rest of their lives and even to eat off the same plate. In light of her scant qualifications as a scholar, Roach legitimized McNally by bestowing on her the title of “lama,” a designation for a teacher of Tibetan Buddhism.

These revelations severely split the Tibetan Buddhist community. The reprimands were swift and forceful. Several respected lamas demanded that he hand back his monk’s robes. Others, including Lama Zopa Rinpoche, who heads the Foundation for the Preservation of the Mahayana Tradition, a large and wealthy group of Tibetan Buddhists, advised that he prove his claims by publicly showing the miraculous powers that are said to come with enlightenment – or be declared a heretic. That Zopa Rinpoche was one of Roach’s greatest mentors made the criticism all the more pertinent and scathing. Robert Thurman, a professor of religious studies at Columbia University, met with Roach and McNally shortly after Roach published his open letter. He was concerned that Roach had broken his vows and that his continuing as a monk could damage the reputation of the larger Tibetan Buddhist community. “I told him, ‘You can’t be a monk and have a girlfriend; you have clearly given up your vow,’” Thurman says. “To which he responded that he had never had genital contact with a human female. So I turned to her and asked if she was human or not. She said right away, ‘He said it. I didn’t.’ There was a pregnant pause, and then she said, ‘But can’t he do whatever he wants, since he has directly realized emptiness?’” On the phone I can hear Thurman consider his words and sigh. “It seemed like they had already descended into psychosis.”

Link: Learning How to Die in the Anthropocene

… The challenge the Anthropocene poses is a challenge not just to national security, to food and energy markets, or to our “way of life” — though these challenges are all real, profound, and inescapable. The greatest challenge the Anthropocene poses may be to our sense of what it means to be human. Within 100 years — within three to five generations — we will face average temperatures 7 degrees Fahrenheit higher than today, rising seas at least three to 10 feet higher, and worldwide shifts in crop belts, growing seasons and population centers. Within a thousand years, unless we stop emitting greenhouse gases wholesale right now, humans will be living in a climate the Earth hasn’t seen since the Pliocene, three million years ago, when oceans were 75 feet higher than they are today. We face the imminent collapse of the agricultural, shipping and energy networks upon which the global economy depends, a large-scale die-off in the biosphere that’s already well on its way, and our own possible extinction. If Homo sapiens (or some genetically modified variant) survives the next millenniums, it will be survival in a world unrecognizably different from the one we have inhabited.

Geological time scales, civilizational collapse and species extinction give rise to profound problems that humanities scholars and academic philosophers, with their taste for fine-grained analysis, esoteric debates and archival marginalia, might seem remarkably ill suited to address. After all, how will thinking about Kant help us trap carbon dioxide? Can arguments between object-oriented ontology and historical materialism protect honeybees from colony collapse disorder? Are ancient Greek philosophers, medieval theologians, and contemporary metaphysicians going to keep Bangladesh from being inundated by rising oceans?

Of course not. But the biggest problems the Anthropocene poses are precisely those that have always been at the root of humanistic and philosophical questioning: “What does it mean to be human?” and “What does it mean to live?” In the epoch of the Anthropocene, the question of individual mortality — “What does my life mean in the face of death?” — is universalized and framed in scales that boggle the imagination. What does human existence mean against 100,000 years of climate change? What does one life mean in the face of species death or the collapse of global civilization? How do we make meaningful choices in the shadow of our inevitable end?

These questions have no logical or empirical answers. They are philosophical problems par excellence. Many thinkers, including Cicero, Montaigne, Karl Jaspers, and The Stone’s own Simon Critchley, have argued that studying philosophy is learning how to die. If that’s true, then we have entered humanity’s most philosophical age — for this is precisely the problem of the Anthropocene. The rub is that now we have to learn how to die not as individuals, but as a civilization.

Learning how to die isn’t easy. In Iraq, at the beginning, I was terrified by the idea. Baghdad seemed incredibly dangerous, even though statistically I was pretty safe. We got shot at and mortared, and I.E.D.’s laced every highway, but I had good armor, we had a great medic, and we were part of the most powerful military the world had ever seen. The odds were good I would come home. Maybe wounded, but probably alive. Every day I went out on mission, though, I looked down the barrel of the future and saw a dark, empty hole.

“For the soldier death is the future, the future his profession assigns him,” wrote Simone Weil in her remarkable meditation on war, “The Iliad, or the Poem of Force.” “Yet the idea of man’s having death for a future is abhorrent to nature. Once the experience of war makes visible the possibility of death that lies locked up in each moment, our thoughts cannot travel from one day to the next without meeting death’s face.” That was the face I saw in the mirror, and its gaze nearly paralyzed me.

I found my way forward through an 18th-century samurai manual, Yamamoto Tsunetomo’s “Hagakure,” which commanded: “Meditation on inevitable death should be performed daily.” Instead of fearing my end, I owned it. Every morning, after doing maintenance on my Humvee, I’d imagine getting blown up by an I.E.D., shot by a sniper, burned to death, run over by a tank, torn apart by dogs, captured and beheaded, and succumbing to dysentery. Then, before we rolled out through the gate, I’d tell myself that I didn’t need to worry, because I was already dead. The only thing that mattered was that I did my best to make sure everyone else came back alive. “If by setting one’s heart right every morning and evening, one is able to live as though his body were already dead,” wrote Tsunetomo, “he gains freedom in the Way.”

I got through my tour in Iraq one day at a time, meditating each morning on my inevitable end. When I left Iraq and came back stateside, I thought I’d left that future behind. Then I saw it come home in the chaos that was unleashed after Katrina hit New Orleans. And then I saw it again when Sandy battered New York and New Jersey: Government agencies failed to move quickly enough, and volunteer groups like Team Rubicon had to step in to manage disaster relief.

Now, when I look into our future — into the Anthropocene — I see water rising up to wash out lower Manhattan. I see food riots, hurricanes, and climate refugees. I see 82nd Airborne soldiers shooting looters. I see grid failure, wrecked harbors, Fukushima waste, and plagues. I see Baghdad. I see the Rockaways. I see a strange, precarious world.

Our new home.

The human psyche naturally rebels against the idea of its end. Likewise, civilizations have throughout history marched blindly toward disaster, because humans are wired to believe that tomorrow will be much like today — it is unnatural for us to think that this way of life, this present moment, this order of things is not stable and permanent. Across the world today, our actions testify to our belief that we can go on like this forever, burning oil, poisoning the seas, killing off other species, pumping carbon into the air, ignoring the ominous silence of our coal mine canaries in favor of the unending robotic tweets of our new digital imaginarium. Yet the reality of global climate change is going to keep intruding on our fantasies of perpetual growth, permanent innovation and endless energy, just as the reality of mortality shocks our casual faith in permanence.

The biggest problem climate change poses isn’t how the Department of Defense should plan for resource wars, or how we should put up sea walls to protect Alphabet City, or when we should evacuate Hoboken. It won’t be addressed by buying a Prius, signing a treaty, or turning off the air-conditioning. The biggest problem we face is a philosophical one: understanding that this civilization is already dead. The sooner we confront this problem, and the sooner we realize there’s nothing we can do to save ourselves, the sooner we can get down to the hard work of adapting, with mortal humility, to our new reality.

The choice is a clear one. We can continue acting as if tomorrow will be just like yesterday, growing less and less prepared for each new disaster as it comes, and more and more desperately invested in a life we can’t sustain. Or we can learn to see each day as the death of what came before, freeing ourselves to deal with whatever problems the present offers without attachment or fear.

If we want to learn to live in the Anthropocene, we must first learn how to die.

Link: A Mysterious Death at the South Pole

Fifty people. The most remote base on the planet. No way in or out for eight months. Then one of them dies under curious circumstances. A new look into one of Antarctica’s most enduring enigmas.

During the 24 hours that Rodney Marks’s life was slipping away from him, he had plenty of time to contemplate his predicament. He knew he was trapped, cut off from adequate medical attention, about as far from civilization as one can get on this planet. He knew that during the long, dark winters at the South Pole — where for eight months of the year it’s too cold to land a plane – small problems become big ones very fast.

As the 32-year-old Australian astrophysicist lay on the old navy gurney in the biomed facility of the Amundsen-Scott base, Marks may have been thinking about the Russian doctor who had to give himself an appendectomy during a South Pole “winterover” in 1961, or of Dr. Jerri Nielsen, who in 1999 diagnosed and treated her own breast cancer with supplies dropped in by parachute. But unlike them, neither Marks nor the base’s lone physician had any idea what was wrong with him. He had woken up at 5:30 that morning vomiting blood, and the burn that had started in the pit of his stomach was now radiating throughout his body.

It was already Marks’s second visit to the makeshift hospital that day, and he arrived scared, anxious, and wearing sunglasses to protect his unbearably sensitive eyes. There was no one medical condition that the base physician, Dr. Robert Thompson, could think of that would explain what was happening to Marks. The doctor’s only link to the outside world was an internet connection and a satellite phone, and both were down at the time – the base’s position at the bottom of the planet meant it lost its signal for much of each day. The doctor spent hours clutching for a diagnosis, at one point grabbing hold of alcohol withdrawal and even anxiety as possibilities.

Thompson injected Marks with a sedative, which calmed him enough that he decided to return to his own bed and rest for a while. He lay beside his girlfriend, Sonja, sleepless and afraid, listening to the shifting ice groan beneath him. Then he retched again. More blood. His breathing was now uncontrollably fast. Pain throbbed in his joints, and he began to panic. He made his way back to Biomed, this time stumbling through the dimly lit tunnels, disoriented, as if in fast motion.

By the time he arrived, he was hyperventilating and combative. Thompson gave him another injection – this time Haldol, a powerful antipsychotic – just to regain control of him. As it took effect, Marks lay down again, but this time he began to lose consciousness. He moaned quietly with each exhale and squeezed Sonja’s hand lightly. Then his heart stopped.

A stationwide alarm summoned the trauma team, a few trained volunteers whose real jobs could be anything from scientist to mechanic. Darryn Schneider, a fellow physicist and the only other Australian at the base, was the first to arrive. He took over for Sonja, holding the ventilator mask over his good friend’s nose and mouth, desperately pumping air into Marks’s lungs.

Then, just before six in the evening, as the trauma team scrambled to save him and the rest of the 50-member crew were sitting down to dinner, Marks took a deep, sighing breath into his chest – it was his last. It was May 12, 2000, a full five months before a plane would be able to retrieve his body.

Once it was finally flown to Christchurch, New Zealand, that October, a startling discovery would be made, one that would set off an eight-year investigation and a bitter tug-of-war between a New Zealand detective and the National Science Foundation, which administers all U.S.-based research at the South Pole. The search for answers as to what killed Rodney Marks would also open a window into the highly peculiar, sometimes dysfunctional, community of people that operates in isolation there for eight months at a time. Ultimately, the NSF would make sweeping changes in how things are run at the South Pole and who it sends there.

At the time of Marks’s death, though, there was little reason to anticipate such far-reaching ramifications. The rest of the crew assumed he had suffered a heart attack or aneurysm. The NSF itself even issued a statement within hours, saying he “apparently died of natural causes.” But there was nothing natural about the way Rodney Marks died.

Antarctica belongs to no one. Seven countries officially have territorial claims on the continent, but the U.S. has never recognized any of them. Supported by a 1959 treaty of cooperation, 29 countries have set up scientific research stations there, and an ever-changing population of up to 4,500 scientists and support staff from all corners of the globe call it home for anywhere from four days to 14 months at a time.

Nearly all who come to work in Antarctica will first touch down in McMurdo, the continent’s only working township. Resembling a small town in arctic Alaska, it sits at the edge of the ice, where it meets the Southern Ocean. Getting off the plane in Mac Town for the first time is a startling experience. The eight-hour flight from New Zealand aboard one of the cavernous military cargo planes leaves ears ringing and backsides numb. After landing, sensory overload gives way to the blinding absence of color and a Hoth-like landscape: a smoldering volcano in one direction, the Royal Society range and Mount Discovery across McMurdo Sound, ice and snow everywhere.

Nearly a thousand miles from McMurdo, at 90 degrees south, just 100 yards or so from the always slightly moving geographic pole marker, sits the Amundsen-Scott research station, the loneliest habitation on Earth. Named for the first two explorers to reach the South Pole – separately in 1911 and 1912 – the American base is run by the National Science Foundation. In the mid-’50s, the intensifying Cold War goaded the United States into establishing a presence on the continent, so the navy announced it would build and man a permanent base at the South Pole. It launched Operation Deep Freeze in 1955, primarily as a research endeavor. The Dome, in which Marks lived, replaced the original station in 1975. It comprises three separate two-story structures that sit beneath an 18,000-square-foot, 50-foot-high geodesic shell, which acts as a giant windbreak, sheltering the living quarters from the deadly sting of the elements. The buildings themselves look like red portable sheds stacked on top of one another, each with a thick walk-in-freezer-style door.

Amundsen-Scott is populated year-round by scientists – most working for American universities and studying the atmosphere, astronomy, or seismology – and a support staff that includes everyone from cooks to carpenters. Nearly 250 people are based there in the summer, but the population shrinks to just a quarter of that for the austral winter: February through October.

The first week of February is frenzied as the remaining summer crew clears out and the winter crew receives its vital resupplies. The real cold arrives in March, and the base becomes a very different place: Soon the sun no longer makes it above the horizon, and it becomes so cold (temperatures regularly hit minus-80) that a plane’s hydraulic fluids would freeze solid within minutes of touching down. After the last plane leaves, there’s no way in or out for eight months, and the continent goes dark and quiet, just the way a winter Polie likes it.

Understanding what type of person would volunteer to work at the South Pole during the winter is something that has intrigued everyone from social scientists to NASA. The physical screening is rigorous – it’s often said that everyone handed a winter contract has perfect wisdom teeth, and some bases won’t even consider you if you have an appendix – but psychological screening is far less straightforward. Through a series of tests and interviews, the NSF tries to hire people with a rare and delicate balance of good social skills and an antisocial disposition – basically, loners with very long fuses.

Some of the first behavioral studies on the South Pole winterover were launched after the sudden onset of schizophrenia in a construction worker in 1957. He had to be sedated and quarantined for almost an entire winter. Lore has it he was put in an improvised mental ward – a specially built room padded with mattresses. Because incidents like these can spiral out of control quickly this far from civilization, putting entire crews at risk, NASA saw a South Pole winter deployment as an interesting analogue to long stays in space.

"We’re social animals," says Lawrence Palinkas, professor of social policy and health at the University of Southern California and the author of several behavioral studies on social dynamics in Antarctica on behalf of NASA. "The separation from friends and family is stressful. But the lack of stimulation – of new scenery, new faces – actually causes people to have difficulty with cognitive thought. Even in well-adjusted groups, we estimate between 3 and 5 percent will experience some form of psychological problem – sleep disorders, depression, alcohol addiction."

It’s this ability, even willingness, to live in such extreme conditions for such an extended period of time that sets winter Polies apart. They have an odd sense of adventure and actually seem drawn to the isolation and risk. “These are people who thrive on being the last cog,” says Harry Mahar, health and safety officer for the NSF’s polar program from 1992 to 2004. The power plant technicians, for instance, “are the type of people who, in their off year, would run DEW line sites [for distant early warning of missiles] up in the Arctic or power plants in the middle of the Pacific, and they’re damn good mechanics.” That’s a good thing: If the generators at the South Pole go down and can’t be fixed, the crew probably won’t survive.

Link: Mr. Misery

In the December 2004 issue of SPIN, we published Los Angeles journalist/musician Liam Gowing’s detailed, empathetic look at the last years of Elliott Smith’s life and the circumstances that led up to the Grammy-nominated singer-songwriter’s apparent suicide. “Mr. Misery” was difficult to read, a tremendous challenge to edit and fact-check, and one of the most remarkably intimate pieces in the magazine’s history. On the 10th anniversary of Smith’s death, it’s now available for the first time on the site.

Haunted by troubling memories, he spent the last years of his life trying to beat a debilitating drug addiction and pouring out his heart to anyone who would listen. The people who knew him in those years open up about the demons that he battled to the end.

Things did not look good for Elliott Smith in August 2001. If you were in the crowd the night that the acclaimed singer/songwriter headlined Los Angeles’ Sunset Junction Street Fair and didn’t know any better, you might have thought he was an indigent blind man who had wandered up the steps to the stage. He was pale and thin and so stooped over, it looked as though he’d just landed on some distant planet where the gravity was so intense that it required a Herculean effort to simply stand erect. As he sat down and cradled his guitar in his lap, Smith raised his right hand to strike the strings, then dropped it onto the instrument as if he had, at that very moment, fallen asleep.

"I’m sorry," he called out after train-wrecking most of the first half of his set. "I can’t remember the words. I’m so fucked-up."

The scene that warm summer evening was not an unusual one. The entire year had been a train wreck for Smith. The previous December, after returning — strung-out on heroin — from a tour supporting his Figure 8 album, he had abandoned plans to record a follow-up to the 2000 release with longtime producer Rob Schnapf. He’d begun distancing himself from Schnapf’s wife, Margaret Mittleman, his manager since 1994. And although he’d started recording again with Aimee Mann producer Jon Brion (who played on Smith’s 1998 album XO), those sessions had ground to a halt. Several weeks of labor produced reels of false starts and Smith repeatedly saying, “That sucked.”

After Brion submitted a bill for the fruitless sessions to DreamWorks Records (an amount recoupable from Smith’s account), executives Lenny Waronker and Luke Wood called a meeting to figure out what had gone wrong. Long unhappy with the major-label world, where record-company expenditures offset his typically moderate sales, Smith informed Wood and Waronker that their arrangement was unworkable and that the label’s intrusions into his privacy were unacceptable. “Elliott was disappointed with what the record company didn’t do,” says Mittleman. “He felt that they gave up on Figure 8 early. And he was dealing with the promotion department, interviews, radio-station visits, people he couldn’t relate to: disc jockeys, club promoters. It was hard. I’m not putting all the blame on DreamWorks. They just couldn’t get the record on the radio.”

"[Elliott] was always ahead of the market," says Wood. "Eighteen months later, the market would catch up. He was the John Lennon and Bob Dylan of my generation, but unlike those two, he was fighting a cultural tidal wave that always seemed to be going in the opposite direction."

Smith, who by then had progressed from heroin to crack, was not interested in discussing market trends or corporate finance. He demanded to be released from the label; and then, in a message relayed by his lawyer to Waronker and Wood, Smith said that if they refused to break his contract, he would opt out of his obligations to DreamWorks by taking his own life. At Smith’s home in Los Angeles’ Los Feliz neighborhood, above a floor littered with crack pipes and heroin-scalded tinfoil, he had hung a noose, just in case.

But Smith did not commit suicide while in the throes of addiction. Instead, in the months that followed, he threw himself into recording with renewed vigor, first at a friend’s home studio and later at his own New Monkey Studio in nearby Van Nuys. And after successfully kicking his addictions to both heroin and crack in fall 2002, he began playing shows and discussing plans to assemble 30 of his new songs into a double album; he wanted to pour the profits from its sale into the Elliott Smith Foundation, which he had established with drug counselor Jerry Schoenkopf and then-girlfriend Valerie Deerin to benefit abused children. After an optimistic birthday celebration in August 2003, Smith even got sober, giving up alcohol as well as red meat, refined sugar, and caffeine. He also began to phase out most of his prescription medications.

Then, on October 21, 2003, everything fell apart again. After a frantic 911 call from his girlfriend, Jennifer Chiba, an ambulance was dispatched to her home in Echo Park, where Smith lay bleeding to death from two stab wounds to the chest.

Was it a suicide? A murder? A freak accident? Nobody seemed to know what had happened. Then came the bombshell. On January 6, the Los Angeles County Department of the Coroner completed its report on Smith’s death. First, toxicology tests confirmed that Smith, widely assumed to be using street drugs again, was clean at the time of his death; all prescribed medications present in his system were at “therapeutic or sub-therapeutic” levels. In her report, deputy medical examiner Lisa Scheinen concluded: “While his history of depression is compatible with suicide, and the location and direction of the stab wounds are consistent with self-infliction, several aspects of the circumstances (as are known at this time) are atypical of suicide and raise the possibility of homicide,” including “stabbing through clothing,” the presence of “incisive wounds…possible defensive wounds” on one arm and one hand, and an unusual “absence of hesitation wounds” around the fatal injury. The report added, “The girlfriend’s reported removal of the knife and subsequent refusal to speak with detectives are all of concern.”

Breaking the news to much of the world, LA Weekly writer Christine Pelisek related the full details of the coroner’s report on the paper’s website on January 7. Pelisek even reported that Smith and Chiba had argued just minutes before she called 911. Several of Smith’s friends were quick to support Chiba. Sound engineer Fritz Michaud addressed the clothing anomaly, saying that “Elliott literally wouldn’t have been caught dead with his shirt off”; and close friend Robin Peringer dismissed the “possible defensive wounds,” explaining that Smith was a “cutter” (a person with an emotional condition that causes them to cut their bodies with knives, razor blades, etc.). But in many courts of public opinion across the country, Jennifer Chiba was Professor Plum in the library with the candlestick.

And then, nothing. No new information. No arrests. “The case is still an open investigation,” says LAPD Detective James King.

Now, after a year of self-imposed silence, Elliott Smith’s family has released From a Basement on the Hill (through Anti- Records, a subsidiary of punk label Epitaph), a stripped-down, single-disc collection of Smith’s final recordings, which, in its present form, offers very little help in answering the lingering questions that surround the artist’s demise.

Did Elliott Smith commit suicide? And if so, why? Many of Smith’s closest friends at the time of his death say yes, and suggest that his depression, alienation, self-loathing, and drug use were merely symptoms of an underlying trauma. To this inner circle, the fact that Smith died sober was no surprise, because as their testimonials suggest, Smith was not suffering from a drug problem — he was searching for a drug solution.

The drive up from the Pacific Coast Highway to David McConnell’s Malibu home studio is one of those journeys that makes the exorbitantly expensive real estate, the earthquakes, and the landslides all seem like small prices to pay for the glories of living in California. After cresting one particularly scenic overlook, I spontaneously gasp, “Oh my God.” Here, before an expanse of ocean so massive that the curvature of the earth starts to reveal itself, Smith began again to record what he intended to be his magnum opus, the proposed double album.

He was in bad shape when he arrived here in May 2001. In addition to drinking heavily, Smith had been smoking up to $1,500 worth of heroin and crack per day, as well as ingesting potentially deadly amounts of prescription tranquilizers. “I learned really fast that there was no way you could intervene in his drug habit; he would have killed himself before he let anyone intervene,” McConnell says. “So I had him on constant suicide watch. He tried OD’ing. He would say things like, ‘The other day I popped 15 Klonopin, thinking it would help me die, and it didn’t.’ It didn’t work! The guy was immune to drugs. I’ve never seen anything like it in my life, where somebody could take that many drugs and walk away. He used to talk about it: ‘Fuck, man! I just did $800 worth of drugs in an hour! What’s wrong? What the fuck!’”

But the regimen of tranquilizers and narcotics was something fairly new to Smith. Though he had been inextricably linked with heroin since he used the drug as a lyrical metaphor for all manner of dependency on his self-titled second album (released in 1995), he later dismissed rumors that he’d been a junkie before moving to Los Angeles in 2000. But if his method of execution was new, Smith’s appetite for self-destruction was certainly not.

"Elliott told me about having a psychotic episode while he was [recording Figure 8],” says McConnell, a producer who has recorded the Los Angeles bands Goldenboy and Alaska!, as well as Josie Cotton, singer of the ’80s novelty hit “Johnny, Are You Queer?” “He was fed up with the current state of his life. A lot of people from the label were telling him he needed to get it together. He was so sick of people talking about the future. So he carved the word ‘now’ into his arm with a knife. And he sat down at the piano and wrote ‘Everything Means Nothing to Me’ as the blood was dripping down his arm.”

Link: How I'm Going to Commit Suicide

A shockingly honest (and beautifully elegant) confession by Britain’s most celebrated art critic, Brian Sewell.

Every night I swallow a handful of pills. In the morning and during the day I swallow others, haphazardly, for I am not always in the right place at the right time, but at night there is a ritual.

I undress. I clean my teeth. I wipe the mirror clear of splashes and see with some distaste the reflection of my decaying body, wondering that it ever had the impertinence to indulge in the pleasures of the flesh.

And then I take the pills. Some are for a heart that too often makes me feel that I have a misfiring single-cylinder diesel engine in my rib-cage.

Others are for the ordinary afflictions of age and still others ease the aches of old bones that creak and crunch. All in their way are poisons – that they do no harm is only a matter of dosage.

I intend, one day, to take an overdose. Not yet, for the experts at that friendly and understanding hospital, the Brompton in Kensington, manage my heart condition very well.

But the bone-rot will reach a point – not beyond endurance but beyond my willingness to endure it – when drugs prescribed to numb the pain so affect the functions of my brain that all the pleasures of music, art and books are dulled, and I merely exist.

An old buffer in a chair, sleeping and waking, sleeping and waking.

The thought of suicide is a great comfort, for it is what I shall employ if mere existence is ever all that I have. The difficulty will be that I must have the wit to identify the time, the weeks, the days, even the critical moment (for it will not be long) between my recognising the need to end my life and the loss of my physical ability to carry out the plan.

There is a plan. I know exactly what I want to do and where I want to do it – not at home, not in my own bed. I shall write a note addressed ‘To whom it may concern’ explaining that I am committing suicide, that I am in sound mind, that no one else has been involved and, if I am discovered before my heart has stopped, I do not want to be resuscitated.

With this note in my pocket, I shall leave the house and totter off to a bench – foolishly installed by the local authority on a road so heavy with traffic that no one ever sits there – make myself comfortable and down as many pills as I can with a bottle of Bombay Gin, the only spirit that I like, to send them on their way.

With luck, no one will notice me for hours – and if they do, will think me an old drunk. Some unfortunate athlete will find me, stiff with rigor, on his morning jog.

I have left my cadaver to a teaching hospital for the use and abuse of medical students – and my sole misgiving is that, having filled it with poisons, I may have rendered it useless.

There are those who damn the suicide for invading the prerogative of the Almighty. Many years, however, have passed since I abandoned the beliefs, observances and irrational prejudices of Christianity, and I have no moral or religious inhibitions against suicide.

I cherish the notion of dying easily and with my wits about me. I am 82 tomorrow and do not want to die a dribbling dotard waiting for the Queen’s congratulatory greeting in 2031.

Nor do I wish to cling to an increasingly wretched life made unconscionable misery by acute or chronic pain and the humiliations of nursing.

What virtue can there be in suffering, in impotent wretchedness, in the bedpans and pisspots, the feeding with a spoon, the baby talk, the dwindling mind and the senses slipping in and out of consciousness?

For those so affected, dying is a prolonged and degrading misadventure. ‘We can ease the pain,’ says another of this interregnum between life and death. But what of those who want to hurry on?

Then the theologian argues that a man must not play God and determine his own end and prates of the purification of the soul through suffering and pain.

But what if the dying man is atheist or agnostic or has lost his faith – must he suffer life longer because of the prejudice of a Christian theologian? And has it occurred to no theologian that God himself might inspire the thought of suicide – or is that too great a heresy?

Link: Russians Who Raised the Dead

In the years before World War II, Russian scientists attempted to revive fish and dog heads, and even a human being. Excerpted from “How to Make a Zombie.”

[Sergei] Bryukhonenko…graduated from Moscow University Medical School in 1914, just in time to be drafted into the Imperial Russian Army and bear witness to the horrors of the First World War. After the Russian revolution, he worked for several years in a large hospital, before turning to his famous experiments. At the time, the field of physiology was maturing rapidly, and Bryukhonenko decided to study the intricate workings of the organs. To do so, it was necessary to keep individual organs functioning once they had been removed from their host. In a cramped and underequipped laboratory he set himself to the task of keeping organs alive.

In May 1925, at the meeting of the Second Congress of Russian Pathologists, Bryukhonenko demonstrated the fruits of three years’ labour in the lab: the original heart-lung machine that he had built for his dogs’ heads. Using two electric pumps, the primitive life-support system drew exhausted blood from the head and deposited it in a glass chamber where it was warmed and oxygenated, then pumped back into the animal. In these early days, this “autojektor” was not hermetically sealed, and eventually the blood supply would coagulate and the system would fail. Nevertheless, Bryukhonenko could keep a dog’s head alive for about one hundred minutes. His results were met with little fanfare, however, and failed to provoke any mention in the popular press. The following year he again demonstrated the autojektor, outlining the progress he and his colleague Sergei Chechulin had made in prolonging the lifespan of their test subjects. Again, there was no coverage.

Six months later the Soviet media finally broke the silence surrounding the device, and once they did the story gathered an unstoppable momentum. Prosaic technicians mused about how it might mean that surgeons would be able to repair a diseased heart while the machine was used to keep the patient alive; the more fanciful dreamers envisioned the birth of a full-throttled immortality engine in Bryukhonenko’s lab. Public dismay mounted over the conditions under which Bryukhonenko had been forced to concoct his life-support system, and the director of the Chemical-Pharmaceutical Institute was compelled to increase the provision for Bryukhonenko’s research to thirty thousand roubles. The grant came from the People’s Commissariat for the Protection of Health, the highest agency responsible for medical research in the USSR.

With this funding, over the next year Bryukhonenko was able to produce five papers on various aspects of autojektor experiments. He presented these at the Congress of Soviet Physiologists in 1928 – and this time, with the full backing of the Soviet government, there was no delay in provoking a media sensation. Rumours quickly circulated on American campuses that the communist scientists had succeeded in reanimating the dead. In February 1929, a student paper at the Massachusetts Institute of Technology reported the news that Bryukhonenko and Chechulin had kept a severed dog’s head alive for three and a half hours with “a queer-looking affair made of glass and rubber tubing”. Within the month, Time magazine shared a bulletin: “Vague reports have been reaching the U.S. that Russian scientists have revivified corpses”. On hearing of the invention, the playwright George Bernard Shaw quipped, “I am greatly tempted to have my head cut off so that I may continue to dictate plays and books independently of any illness, without having to dress and undress or eat or do anything at all but to produce masterpieces of dramatic art and literature.”

The ability to sustain an animal using a heart-lung machine allowed for a much more mechanistic view of life. Metaphysical concepts for separating the living and the dead – such as the Catholic soul or the Vodou nanm – were threatened with obsolescence in the face of modern medicine. If the only difference between being alive and being dead was having a heartbeat, then wouldn’t a corpse revived with a machine be alive? And why shouldn’t a machine take the place of a broken heart?

Keeping a head alive was one thing; raising the dead was quite another. Bryukhonenko was not the first Russian to dedicate himself to the problem. As early as February 1902, Aleksei Aleksandrovich Kuliabko of the Physiological Laboratory of the Imperial Academy of Sciences in St Petersburg had restarted a rabbit heart that had stopped beating forty-four hours previous, and went on to repeat this procedure on animal hearts up to five days post-mortem. The next year he procured the heart of a three-month-old infant who had died from pneumonia two days earlier. Using Locke’s solution – a mixture containing sodium chloride, calcium chloride, potassium chloride, sodium bicarbonate and dextrose, and designed by the British physiologist Frank Spiller Locke specifically to keep excised hearts pumping – Kuliabko was able to bring the baby’s heart back to life. In 1907, he developed techniques for artificial circulation that could revive a severed fish head. Between 1910 and 1913, another Russian, Fyodor Andreyev, succeeded in resuscitating an electrocuted dog by injecting a combination of saline and adrenaline into the bloodstream and then applying an electric shock to the heart. Andreyev would later become director of the hospital where Bryukhonenko spent his postwar years, and no doubt encouraged the young doctor to explore their common interest in reanimation.

In 1929, as Bryukhonenko was attaching dogs’ heads to his autojektor, Aleksei Kuliabko set aside his outmoded fish heads and prepared his most ambitious experiment yet: a secret attempt to reanimate a human. He was joined in the experiment by the “chemico-pharmacist” Fyodor Andreyev, several assistants and a man who had passed away during surgery the day before. The team arranged the corpse on an operating table and attached a tangle of pumps to the blood vessels so that they could be pumped full of Locke’s solution and adrenaline. The man’s heart heaved violently in his chest, and a wet choking sound erupted from his throat like a death rattle. Kuliabko’s assistants fled the room in terror. Kuliabko and Andreyev kept the man’s heart beating for twenty minutes before it stopped. When news of Bryukhonenko’s decapitated dogs eventually made the headlines, Andreyev could not resist hinting that the science had already moved on. He told reporters: “The principle has already been demonstrated successfully. It only remains to develop the technique for surgeons to apply practically.”

Perhaps disturbed by the experiment, or wary of the public reaction that might be aroused if word leaked out of a reanimated man, Kuliabko decided to carry out his future trials on dogs, following Bryukhonenko’s lead. One of Kuliabko’s canine subjects showed remarkable resilience: having been poisoned and revived once, it was purportedly poisoned again and left dead for several months, before being successfully revived a second time. But Bryukhonenko had heard about Kuliabko’s experiments with humans and he was ready to try his own hand at them.

He enlisted the help of the experimental surgeon Sergei I. Spasokukotsky, who had helped to engineer the network of blood banks across the Soviet Union. In 1934, showing the same disregard for a person’s self-determination that he had shown for the laws of nature, Bryukhonenko attempted to revive a man who had committed suicide. Just three hours after the man had hanged himself, the doctor slit open an artery and a vein and connected them to the autojektor. The machine steadily drew cold dead blood from the corpse and returned it warm and rich with oxygen. For several hours the team waited, listening to the whirr of the autojektor as the dead man’s body slowly warmed. Then a faint sound joined them in the room: a heartbeat.

Link: The Executioners of the Ottoman Empire

The executioners of the Ottoman Empire were never noted for their mercy; just ask the teenage Sultan Osman II, who in May 1622 suffered an excruciating death by “compression of the testicles”, as contemporary chronicles put it, at the hands of an assassin known as Pehlivan the Oil Wrestler. There was reason for this ruthlessness, however; for much of its history (the most successful bit, in fact), the Ottoman dynasty flourished—ruling over modern Turkey, the Balkans and most of North Africa and the Middle East—thanks in part to the staggering violence it meted out to the highest and mightiest members of society.

Seen from this perspective, it might be argued that the Ottomans’ decline set in early in the 17th century, precisely at the point when they abandoned the policy of ritually murdering a significant proportion of the royal family whenever a sultan died, and substituted the Western notion of simply giving the job to the first-born son instead. Before then, Ottoman succession had been governed by the “law of fratricide” drawn up by Mehmed II in the middle of the 15th century. Under the terms of this remarkable piece of legislation, whichever member of the ruling dynasty succeeded in seizing the throne on the death of the old sultan was not merely permitted, but enjoined, to murder all his brothers (together with any inconvenient uncles and cousins) in order to reduce the risk of subsequent rebellion and civil war. Although it was not invariably applied, Mehmed’s law resulted in the deaths of at least 80 members of the House of Osman over a period of 150 years. These victims included all 19 siblings of Sultan Mehmed III—some of whom were still infants at the breast, but all of whom were strangled with silk handkerchiefs immediately after their brother’s accession in 1595.

For all its deficiencies, the law of fratricide ensured that the most ruthless of the available princes generally ascended to the throne. That was more than could be said of its replacement, the policy of locking up unwanted siblings in the kafes (“cage”), a suite of rooms deep within the Topkapi palace in Istanbul. From around 1600, generations of Ottoman royals were kept imprisoned there until they were needed, sometimes several decades later, consoled in the meantime by barren concubines and permitted only a strictly limited range of recreations, the chief of which was macramé. This, the later history of the empire amply demonstrated, was not ideal preparation for the pressures of ruling one of the greatest states the world has ever known.

Link: Todd May on Death & Immortality

Todd May is the Class of 1941 Memorial Professor at Clemson University—a very fancy title for a very non-fancy guy. He is bald, plays basketball, has a wife and two kids, and kind of looks like Michel Foucault (which is weird, because Todd has written a book about him). He’s written nine other books, too—including a volume about poststructuralist anarchism and another about friendship under neoliberalism—but with Todd, talking about his resume somehow feels beside the point.

The first time I met Todd was at Nội Bài Airport in Hanoi. I was living there, and Todd had decided to fly over for a visit. My best friend Dan (one of Todd’s students) had put us in touch, and we spent a couple of days riding around on motorbikes and talking about whatever came to mind.

Looking back, I’m amazed at how patient Todd was. He treated me like an equal, never pulling rank or bringing the philosophical hammer down, and at some point it became clear how little stock he put in his own credentials. We became friends—just two curious people trying to figure out what was going on in this life.

Todd’s books read the way he talks—simply and clearly, without pretense. The first one I read—Our Practices, Our Selves: Or, What It Means to be Human—is maybe the humblest treatment of a big existential question that I’ve ever seen from a professional philosopher. It’s just so obvious: Todd doesn’t write to look cool or show you how much he knows. He writes because he’s been thinking about some interesting things, and he wants to share them with you, and maybe you can relate.

Todd’s book about death (the subject of this interview) feels the same way. He’s taken on the biggest and scariest topic there is, but you wouldn’t know it from his tone. There’s a lightness, a sense that whatever we learn by thinking honestly and clearly about dying, it’ll somehow be OK. After all, here we are, talking together.

This interview took place over Skype. I was in Cambridge, Massachusetts, and Todd was in the rec room of his house in Clemson. The room reminded me of my friends’ basements growing up—there were wood-paneled walls, gym equipment strewn about, and a general sense that the rules didn’t quite apply.

THE BELIEVER: I finished the book this morning. About halfway through, I began thinking about Indiana Jones and the Last Crusade. Have you seen it?

TODD MAY: I don’t know if I’ve seen The Last Crusade. I’ve seen several of them.

BLVR: It’s the one where they’re going after the Holy Grail, and it’s a race between Indy and the Nazis.

TM: Yes, I have seen that movie.

BLVR: OK. So, at the very end, they find the Holy Grail, but the Nazis shoot Indy’s dad—Sean Connery—and he’s dying. Indy saves his dad by giving him a sip of water from the Holy Grail (which, as we know, provides everlasting life). Indy then takes a sip himself—for some reason, he doesn’t offer any to his two friends—and then they vanquish the Nazis and ride off into the sunset.

Now, the movie makes all this out to be great, but I remember watching it and feeling really unsettled. True, Indy has saved his dad’s life, but he’s also consigned his dad to living forever. Everyone else around them is going to die at some point, but the two of them will live on in perpetuity.

My sense is that you might actually think Indy made the wrong call—that he did his dad a disservice by giving him everlasting life. Let’s start there.

TM: I think it’s actually more complicated than that. Indy can let his dad die, and that was probably a really bad time for him to die. Or he could extend his life indefinitely, which in the end probably wouldn’t be such a good thing either. So, the paradox I really wanted to press in the book is that neither of these options—dying or being immortal—is a good option.

BLVR: But you ultimately do settle on the side of death, no? You compare death to “a disease whose cure, if it existed, would be worse than the disease itself.” You also write that the things that make our lives distinctive and meaningfully human would fade or would have to be “reconfigured” if we were immortal. In other words, you’re ultimately glad that we have to die, even if you don’t actually look forward to your own death.

TM: Right. I think that’s fair enough. We have to add to that the idea that…well, that life is short, and if death were to be a good thing, it would be a better thing much further down the road than it is for human lives now. All of this, by the way, raises some interesting questions, which I tried to deal with a little bit in the book, in terms of whether I’m the only one who would be immortal or whether everyone would be immortal.

I did an interview with a filmmaker yesterday, and we were talking about this. And he said that he would like to be immortal but would like to be the only one. He said that way, he could see life changing around him enough so that he might not get bored.

BLVR: Someone said something similar to me the other day. And my first thought was, I can’t imagine anything lonelier than knowing that everyone around me will die one day. In fact, the first thing I imagined was jealousy—jealousy of the solidarity and bonds that arise among people who have to live in the face of death, knowing that I’d be on the outside of that.

TM: That’s a very powerful thing you said, and I don’t think I’d thought of that until you just said this now. But I think that’s right—it’s a powerful bond that keeps us together.

There are certain things we can be that are meaningful to us, and other things that we cannot be. If we’re immortal but no one around us is, the same question arises—whether one is simply doing the same thing that one does, just with other people. It becomes like telling a story. You know how you tell a story, and it seems like an interesting story the first bunch of people you tell it to. But at some point in telling that story, if you’ve told it twenty or thirty times, it feels a little… you feel disconnected from the story. I would think that that would happen as well.

BLVR: In my mind, one of the features that makes us who we are is our ethical impulse, our desire to figure out how to live well. You say that under conditions of immortality, “Even justice would be imperiled. The needs of others would not urge themselves on us in the same way, since their existence would not be threatened by our neglect.”

Obviously it’s true that if we can’t die, we needn’t worry about preventing other people’s deaths. But surely people could still suffer, and I’m wondering whether you think that under conditions of immortality, we would be any less concerned by that.

TM: If I remember Borges’ story The Immortal correctly, there is a point where one of the immortals falls into a ravine or something like that and is left there—

BLVR: For decades.

TM: Yeah, for decades. And they said, “Look, we’ll get him, but surely there’s no rush.”

I’m of two minds about that moment. On the one hand, it seems callous in a way that I don’t think one’s immortality would necessarily bequeath. Because if you see somebody suffering, that’s surely going to be reason to stop, to do something to intervene.

BLVR: Yup.

TM: On the other hand, I could imagine they’re thinking this: Well, we’ll get him out of the ravine, but it’s just going to bring him back into this shapeless life that he’s in now. So, the difference between the suffering in the ravine and the shapelessness of our lives is not so great as to foster an urgency. And I don’t know what I think about that.

In the story, all of the monuments among which the immortals lived were left to erode, because they just didn’t have the meaning that they once had, and the immortals said they could always rebuild them at any time. So, I suspect that was the kind of thought that Borges had in mind when they left the person in the ravine.

Accustom yourself to the belief that death is of no concern to us, since all good and evil lie in sensation and sensation ends with death. Therefore the true belief that death is nothing to us makes a mortal life happy, not by adding to it an infinite time, but by taking away the desire for immortality. For there is no reason why the man who is thoroughly assured that there is nothing to fear in death should find anything to fear in life. So, too, he is foolish who says that he fears death, not because it will be painful when it comes, but because the anticipation of it is painful; for that which is no burden when it is present gives pain to no purpose when it is anticipated. Death, the most dreaded of evils, is therefore of no concern to us; for while we exist death is not present, and when death is present we no longer exist. It is therefore nothing either to the living or to the dead since it is not present to the living, and the dead no longer are.
— Epicurus, Letter to Menoeceus
I believe that when I die I shall rot, and nothing of my ego will survive. I am not young and I love life. But I should scorn to shiver with terror at the thought of annihilation. Happiness is nonetheless true happiness because it must come to an end, nor do thought and love lose their value because they are not everlasting. Many a man has borne himself proudly on the scaffold; surely the same pride should teach us to think truly about man’s place in the world. Even if the open windows of science at first make us shiver after the cosy indoor warmth of traditional humanizing myths, in the end the fresh air brings vigour, and the great spaces have a splendour of their own.
— Bertrand Russell, What I Believe