Sunshine Recorder

Link: Antibiotics, Capitalism and the Failure of the Market

In March 2013, England’s Chief Medical Officer, Dame Sally Davies, gave the stark warning that antimicrobial resistance poses “a catastrophic threat”. Unless we act now, she argued, “any one of us could go into hospital in 20 years for minor surgery and die because of an ordinary infection that can’t be treated by antibiotics. And routine operations like hip replacements or organ transplants could be deadly because of the risk of infection.”[1]

Over billions of years, bacteria have encountered a multitude of naturally occurring antibiotics and have consequently developed resistance mechanisms to survive. The primary emergence of resistance is random, coming about by DNA mutation or gene exchange with other bacteria. However, the further use of antibiotics then favours the spread of those bacteria that have become resistant.

More than 70% of the pathogenic bacteria that cause healthcare-acquired infections are resistant to at least one of the drugs most commonly used to treat them.[2][3] Increasing resistance in bacteria like Escherichia coli (E. coli) is a growing public health concern because of the very limited therapy options for infections caused by E. coli. This is particularly so for E. coli that is resistant to carbapenem antibiotics, the drugs of last resort.

The emergence of resistance is a complex issue involving the inappropriate use and overuse of antimicrobials in humans and animals. Antibiotics may be administered by health professionals or farmers when they are not required, or patients may take only part of a full course of treatment. This gives bacteria the opportunity to encounter these otherwise life-saving drugs at ineffective levels, survive, and mutate to produce resistant strains. Once they have emerged, resistant strains are allowed to spread by poor infection control and regional surveillance procedures.

These two problems are easily solved by educating healthcare professionals, patients and animal keepers about the importance of antibiotic treatment regimens and keeping to them. Advocating good infection control procedures in hospitals, and investing in surveillance programs that monitor patterns of resistance locally and across the country, would reduce the spread of infection. However, the biggest problem is capitalism and the fact that there is no supply of new antimicrobials.

Between 1929 and the 1970s, pharmaceutical companies developed more than twenty new classes of antimicrobials.[4][5] Since the 1970s, only two new categories of antimicrobials have arrived.[6][7] Today the pipeline of new antibiotic classes active against highly resistant Gram-negative bacteria is dry;[8][9][10] the only novel category in early clinical development has recently been withdrawn.[9][11]

For the last seventy years the human race has kept itself ahead of resistant bacteria by going back into the laboratory and developing the next generation of antimicrobials. However, due to a failure of the market, pharmaceutical companies are no longer interested in developing antibiotics.

Despite the warnings from Dame Sally Davies, drug companies have pulled back from antimicrobial research because there is no profit to be made from it. When used appropriately, a single £100 course of antibiotics will save someone’s life. However, that clinical effectiveness and short-term use have the unfortunate consequence of making antimicrobials significantly less profitable than the pharmaceuticals used in cancer therapy, which can cost £20,000 per year.
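
To put that gap in rough numbers (a back-of-the-envelope illustration using only the figures above, not a calculation from the original piece):

$$\frac{£20{,}000\ \text{per patient per year (cancer therapy)}}{£100\ \text{per curative course (antibiotics)}} = 200\ \text{courses}$$

In other words, a single patient on such a cancer drug generates as much revenue in a year as two hundred complete, life-saving courses of antibiotics, and stewardship policies deliberately push the number of courses sold even lower.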

In our current system, a drug company’s return on its financial investment in antimicrobials depends on the volume of sales. A further problem arises when we factor in the educational programs aimed at teaching healthcare professionals and animal keepers to limit their use of antimicrobials. This, combined with the relative unprofitability, has produced a failure in the market and a paradox for capitalism.

A response commonly proposed by my fellow scientists is that our government must provide incentives for pharmaceutical companies to develop new antimicrobial drugs. Suggestions are primarily focused on reducing the financial risk for drug companies and include grants, prizes, tax breaks, creating public-private partnerships and increasing intellectual property protections. Further suggestions are often related to removing “red tape” and streamlining the drug approval and clinical trial requirements.

In September 2013 the Department of Health published its UK Five Year Antimicrobial Resistance Strategy.[12] The document called for “work to reform and harmonise regulatory regimes relating to the licencing and approval of antibiotics”, better collaboration “encouraging greater public-private investment in the discovery and development of a sustainable supply of effective new antimicrobials” and states that “Industry has a corporate and social responsibility to contribute to work to tackle antimicrobial resistance.”

I think we should have three major objections to these statements. First, the managers in the pharmaceutical industry do not have any responsibility to contribute to work to tackle antimicrobial resistance. They have a responsibility to operate within the law, or be fined, and to make a profit for shareholders, or be replaced. It is the state that has the responsibility for the protection and wellbeing of its citizens.

Secondly, following this year’s horsemeat scandal, we should object to companies cutting corners in an attempt to increase profits. This leads on to the final objection: by promoting public-private collaboration, all the state is doing is subsidising shareholder profits by reducing shareholders’ financial risk.

The market has failed, and novel antimicrobials will require investment based not on a financial return from the volume of antibiotics sold but on the benefit to society of being free from disease.

John Maynard Keynes, in his 1924 Sidney Ball Lecture at Oxford, said that “the important thing for government is not to do things which individuals are doing already, and to do them a little better or a little worse; but to do those things which at present are not done at all”.[13] Mariana Mazzucato, in her 2013 book, The Entrepreneurial State, discusses how the state can lead innovation and criticises the risk and reward relationships in current public-private partnerships.[14] Mazzucato argues that the state can be entrepreneurial and inventive and that we need to reinvent the state and government.

This praise of the potential of the state seems to be supported by the public. Following announcements of energy price rises in October 2013, a YouGov poll found that people were against the NHS being run by the private sector by a margin of 12 to 1; that 67% were in favour of Royal Mail being run in the public sector; that 66% wanted the railway companies to be nationalised; and that 68% were in favour of nationalised energy companies.[15]

We should support state-funded professors, post-doctoral researchers and PhD students as scientists working within the public sector. They could study the mechanisms of drug entry into bacterial cells or screen natural antibiotic compounds. This could not be done on a shoestring budget, and it would no doubt take years to build the infrastructure, but it would also allow us to make the case for where the research takes place.

Andrew Witty’s recent review of higher education and regional growth asked universities to become more involved in their local economies.[16] The state could choose to build laboratories in geographical areas neglected by private sector investment and help promote regional recovery. Even more radically, if novel antibiotics are produced for their social good rather than financial gain, they can be reserved indefinitely until a time of crisis.

With regard to democracy, patients and the general public could have a greater say in what is researched, helping to shift us away from our reliance on the market to provide what society needs. The market responds not to what society needs but to what will create the most profit. This is a recurring theme throughout science. I cannot begin to tell you how frequently I listen to case studies regarding parasites that affect only people in the developing world. Again, the people of the developing world have very little money, so drug companies neglect to develop drugs as there is no source of profit. We should make the case for innovation to be driven not by greed but by service to society and even to our species.

Before Friedrich Hayek, John Desmond Bernal, in his 1939 book The Social Function of Science, argued for more spending on innovation because science was not merely an abstract intellectual enquiry but of real practical value.[17] Bernal placed science and technology among the driving forces of history. Why should we not follow that path?

Link: The Nazi Anatomists

Link: YOU HAVE DIED OF DYSENTERY

Children of the 70s and 80s will likely remember Oregon Trail, the computer game where the player assumes the role of wagon leader and guides a group of settlers through the pioneer landscape of 19th-century America. You would hunt bison, shoot rabbits, ford rivers and pick up other settlers as you made your way from Missouri to Oregon. But just as you really got into the game, the screen would flash the game’s infamous message: YOU HAVE DIED OF DYSENTERY.

If you are like me, you probably shouted: ‘NOT AGAIN!’

So what exactly is dysentery, and why did you and all your settlers keep dying from it in Oregon Trail?

Dysentery is an intestinal inflammation that causes severe diarrhea, usually characterised by mucus or blood in the feces. Left untreated, the disease can lead to rapid loss of fluids, dehydration, and eventually death.

There are two forms of dysentery. One is caused by a bacterium, the other by an amoeba. The former is the most common in Western Europe and the United States, and is typically spread through contaminated food and water.

Outbreaks of dysentery were more prevalent during wartime, when the disease spread rampantly because of the unhygienic conditions of the camps. During the Mexican War (1846-48), a staggering 88% of deaths were due to infectious disease, overwhelmingly dysentery. For every man killed in battle, seven died of disease. The American Civil War was no better. You were more likely to die off the battlefield than on it, and dysentery was the primary cause.

That said, civilians also died of dysentery with some frequency in the 19th century, especially those who were itinerant. Pioneers travelling the Oregon Trail wouldn’t have fared much better than soldiers fighting in war. They would have travelled in large groups—wagon after wagon trailing one another—and their access to clean water and food would have been severely limited. In 1853, one pioneer wrote in her diary: ‘Still in camp, husband and myself being sick (caused, we suppose, by drinking the river water, as it looks more like dirty suds than anything else)’.

Diseases such as tuberculosis, flu, measles and smallpox spread like wildfire through their crowded, makeshift camps. Dysentery would have been one of the leading causes of death amongst these pioneers, although it is difficult to determine just how many died from it as medical records were typically not kept.

What we do know is that roughly 20,000 people died travelling the 2,000-mile trail in the 19th century. To put that in perspective: there was an average of ten graves per mile. Burials were often hastily done right in the middle of the trail. This would allow wagons and animals to trample down the grave so that the scent of decomposition was erased and wolves wouldn’t feast on the remains.

In another diary from the period, one pioneer writes: ‘A grave three feet deep and wide enough to receive the eleven victims [of a massacre] was dug, and the bodies placed in it. Wolves excavated the grave and devoured the remains…[Volunteers] gathered up the bones, placed them in a wagon box, and again buried them.’

So there you have it.  Life on the Oregon Trail was just as rough as the computer game would have us believe. Food was scarce. Roads were treacherous. And disease was rampant.

I will never again complain about the inconveniences of air travel.

Link: Beyond Recognition

The incredible story of a face transplant.

… Like the patients who came before her, Tarleton’s journey has been something of an unfathomable one. In the summer of 2007, she was the victim of a brutal attack perpetrated by her ex-husband, Herbert Rodgers. He broke into her home in the dead of night, carrying a baseball bat and a bottle of industrial-strength lye. He used both, and he didn’t stop until Tarleton had sustained what one doctor later described as “the most horrific injury a human being could suffer.”

Tarleton awoke from a three-month induced coma in September of that year. Her body, marred by deep chemical burns, was wrapped in bandages and covered in grafts — some taken from cadavers, the rest harvested from her own legs. Her eyelids were gone, as was her left ear. She couldn’t blink, smile, or breathe through her nose.

During that coma, doctors performed 38 surgeries to repair what deficits they could. And over a period of five years, she would undergo another 17 operations, including a series of synthetic corneal implants that eventually restored partial vision to one eye. Despite these efforts, Tarleton’s progress eventually stalled — given the limitations of conventional procedures, it was impossible that full facial functions, from movement to sensation, would ever return. And her face, there was no question, would never look the way it had before. “I had forgotten what it was like to look more normal,” she says. “I had to accept that I would always look this way, and I had to be okay with that.”

Ironically, it wasn’t until Tarleton had cultivated this acceptance, she says, that the prospect of a face transplant emerged. In December of 2011, she received a striking proposition from Dr. Bohdan Pomahac at Brigham and Women’s Hospital in Boston. He had recently performed the first successful full face-transplant in the US, and he wanted to know if Tarleton would consider the procedure.

It wasn’t an easy answer. Before being approved for a face transplant, Tarleton would need to travel two hours from her home in Vermont to Boston, several times over several months, for extensive physical and psychological exams. Doctors needed to be sure that her immune system could cope with the procedure, and assess the blood vessels, nerves, and muscles deep within her skull. A team of psychological experts would evaluate Tarleton’s mental health and the strength of her support network. The procedure itself would be grueling and dangerous, and the rehabilitation process would be extensive. But the payoff — the prospect of eyes that could blink, a mouth able to kiss — would transform her life.

Several months after that call, Tarleton had cleared every hurdle, and her name was added to a waitlist while surgeons scoured for viable donors. To meet the criteria, a donor had to be brain dead with no prospect for recovery — the harvested tissue needs to be flushed with blood and nutrients until the last possible moment — and be an adequate match for Tarleton’s skin tone and texture, as well as her age and sex. In her case, it took 14 months before that donor, Cheryl, was found.

Link: The Obesity Era

As the American people got fatter, so did marmosets, vervet monkeys and mice. The problem may be bigger than any of us. 

Years ago, after a plane trip spent reading Fyodor Dostoyevsky’s Notes from the Underground and Weight Watchers magazine, Woody Allen melded the two experiences into a single essay. ‘I am fat,’ it began. ‘I am disgustingly fat. I am the fattest human I know. I have nothing but excess poundage all over my body. My fingers are fat. My wrists are fat. My eyes are fat. (Can you imagine fat eyes?).’ It was 1968, when most of the world’s people were more or less ‘height-weight proportional’ and millions of the rest were starving. Weight Watchers was a new organisation for an exotic new problem. The notion that being fat could spur Russian-novel anguish was good for a laugh.

That, as we used to say during my Californian adolescence, was then. Now, 1968’s joke has become 2013’s truism. For the first time in human history, overweight people outnumber the underfed, and obesity is widespread in wealthy and poor nations alike. The diseases that obesity makes more likely — diabetes, heart ailments, strokes, kidney failure — are rising fast across the world, and the World Health Organisation predicts that they will be the leading causes of death in all countries, even the poorest, within a couple of years. What’s more, the long-term illnesses of the overweight are far more expensive to treat than the infections and accidents for which modern health systems were designed. Obesity threatens individuals with long twilight years of sickness, and health-care systems with bankruptcy.

And so the authorities tell us, ever more loudly, that we are fat — disgustingly, world-threateningly fat. We must take ourselves in hand and address our weakness. After all, it’s obvious who is to blame for this frightening global blanket of lipids: it’s us, choosing over and over again, billions of times a day, to eat too much and exercise too little. What else could it be? If you’re overweight, it must be because you are not saying no to sweets and fast food and fried potatoes. It’s because you take elevators and cars and golf carts where your forebears nobly strained their thighs and calves. How could you do this to yourself, and to society?

Moral panic about the depravity of the heavy has seeped into many aspects of life, confusing even the erudite. Earlier this month, for example, the American evolutionary psychologist Geoffrey Miller expressed the zeitgeist in this tweet: ‘Dear obese PhD applicants: if you don’t have the willpower to stop eating carbs, you won’t have the willpower to do a dissertation. #truth.’ Businesses are moving to profit on the supposed weaknesses of their customers. Meanwhile, governments no longer presume that their citizens know what they are doing when they take up a menu or a shopping cart. Yesterday’s fringe notions are becoming today’s rules for living — such as New York City’s recent attempt to ban large-size cups for sugary soft drinks, or Denmark’s short-lived tax surcharge on foods that contain more than 2.3 per cent saturated fat, or Samoa Air’s 2013 ticket policy, in which a passenger’s fare is based on his weight because: ‘You are the master of your air ‘fair’, you decide how much (or how little) your ticket will cost.’

Several governments now sponsor jauntily named pro-exercise programmes such as Let’s Move! (US), Change4Life (UK) and actionsanté (Switzerland). Less chummy approaches are spreading, too. Since 2008, Japanese law has required companies to measure and report the waist circumference of all employees between the ages of 40 and 74 so that, among other things, anyone over the recommended girth can receive an email of admonition and advice.

Hand-in-glove with the authorities that promote self-scrutiny are the businesses that sell it, in the form of weight-loss foods, medicines, services, surgeries and new technologies. A Hong Kong company named Hapilabs offers an electronic fork that tracks how many bites you take per minute in order to prevent hasty eating: shovel food in too fast and it vibrates to alert you. A report by the consulting firm McKinsey & Co predicted in May 2012 that ‘health and wellness’ would soon become a trillion-dollar global industry. ‘Obesity is expensive in terms of health-care costs,’ it said before adding, with a consultantly chuckle, ‘dealing with it is also a big, fat market.’

And so we appear to have a public consensus that excess body weight (defined as a Body Mass Index of 25 or above) and obesity (BMI of 30 or above) are consequences of individual choice. It is undoubtedly true that societies are spending vast amounts of time and money on this idea. It is also true that the masters of the universe in business and government seem attracted to it, perhaps because stern self-discipline is how many of them attained their status. What we don’t know is whether the theory is actually correct.
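
For reference, the cut-offs above come from the standard BMI formula, stated here for clarity (the calculation itself is not spelled out in the piece):

$$\text{BMI} = \frac{\text{weight in kilograms}}{(\text{height in metres})^2}$$

So a person 1.75 m tall is classed as overweight from roughly 77 kg and as obese from roughly 92 kg.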

Of course, that’s not the impression you will get from the admonishments of public-health agencies and wellness businesses. They are quick to assure us that ‘science says’ obesity is caused by individual choices about food and exercise. As the Mayor of New York, Michael Bloomberg, recently put it, defending his proposed ban on large cups for sugary drinks: ‘If you want to lose weight, don’t eat. This is not medicine, it’s thermodynamics. If you take in more than you use, you store it.’ (Got that? It’s not complicated medicine, it’s simple physics, the most sciencey science of all.)

Yet the scientists who study the biochemistry of fat and the epidemiologists who track weight trends are not nearly as unanimous as Bloomberg makes out. In fact, many researchers believe that personal gluttony and laziness cannot be the entire explanation for humanity’s global weight gain. Which means, of course, that they think at least some of the official focus on personal conduct is a waste of time and money. As Richard L Atkinson, Emeritus Professor of Medicine and Nutritional Sciences at the University of Wisconsin and editor of the International Journal of Obesity, put it in 2005: ‘The previous belief of many lay people and health professionals that obesity is simply the result of a lack of willpower and an inability to discipline eating habits is no longer defensible.’

Link: The Girl Who Turned to Bone

Unexpected discoveries in the quest to cure an extraordinary skeletal condition show how medically relevant rare diseases can be.

When Jeannie Peeper was born in 1958, there was only one thing amiss: her big toes were short and crooked. Doctors fitted her with toe braces and sent her home. Two months later, a bulbous swelling appeared on the back of Peeper’s head. Her parents didn’t know why: she hadn’t hit her head on the side of her crib; she didn’t have an infected scratch. After a few days, the swelling vanished as quickly as it had arrived.

When Peeper’s mother noticed that the baby couldn’t open her mouth as wide as her sisters and brothers, she took her to the first of various doctors, seeking an explanation for her seemingly random assortment of symptoms. Peeper was 4 when the Mayo Clinic confirmed a diagnosis: she had a disorder known as fibrodysplasia ossificans progressiva (FOP).

The name meant nothing to Peeper’s parents—unsurprising, given that it is one of the rarest diseases in the world. One in 2 million people have it.

Peeper’s diagnosis meant that, over her lifetime, she would essentially develop a second skeleton. Within a few years, she would begin to grow new bones that would stretch across her body, some fusing to her original skeleton. Bone by bone, the disease would lock her into stillness. The Mayo doctors didn’t tell Peeper’s parents that. All they did say was that Peeper would not live long.

“Basically, my parents were told there was nothing that could be done,” Peeper told me in October. “They should just take me home and enjoy their time with me, because I would probably not live to be a teenager.” We were in Oviedo, Florida, in an office with a long, narrow sign that read The International Fibrodysplasia Ossificans Progressiva Association. Peeper founded the association 25 years ago, and remains its president. She was dressed in a narrow-waisted black skirt and a black-and-white striped blouse. A large ring in the shape of a black flower encircled one of her fingers. Her hair was peach-colored.

Peeper sat in a hulking electric wheelchair tilted back at a 30-degree angle. Her arms were folded, like those of a teacher who has run out of patience. Her left hand was locked next to her right biceps. I could make out some of the bones under the skin of her left arm: long, curved, extraneous.

“It’s good to finally meet you,” she said when I walked in. Her face was almost entirely frozen; she spoke by drawing her lower lip down and out to the sides. Bones had immobilized her neck, so she had to look at me with a sidelong gaze. Her right hand, resting on her wheelchair’s joystick, contained the only free-moving joint in her body. It rose and swung toward me. We shook hands.

Peeper’s condition is extremely rare—but in that respect, she actually has a lot of company. A rare disease is defined as any condition affecting fewer than 200,000 patients in the United States. More than 7,000 such diseases exist, afflicting a total of 25 million to 30 million Americans.

The symptoms of these diseases may differ, but the people who suffer from them share many experiences. Rare diseases frequently go undiagnosed, or misdiagnosed, for years. Once people do find out that they suffer from a rare disease, many discover that medicine cannot help them. Not only is there no drug to prescribe, but in many cases, scientists have little idea of the underlying cause of the disease. And until recently, people with rare diseases had little reason to hope this would change. The medical-research establishment treated them as a lost cause, funneling resources to more-common ailments like cancer and heart disease.

In 1998, this magazine ran a story recounting the early attempts by scientists to understand fibrodysplasia ossificans progressiva. Since then, their progress has shot forward. The advances have come thanks in part to new ways of studying cells and DNA, and in part to Jeannie Peeper.

Starting in the 1980s, Peeper built a network of people with FOP. She is now connected to more than 500 people with her condition—a sizable fraction of all the people on Earth who suffer from it. Together, members of this community did what the medical establishment could not: they bankrolled a laboratory dedicated solely to FOP and have kept its doors open for more than two decades. They have donated their blood, their DNA, and even their teeth for study.

Meanwhile, the medical establishment itself has shifted its approach to rare diseases, figuring out ways to fund research despite the inherently limited audience. Combined with Peeper’s dedication, this sea change has enabled scientists to pinpoint the genetic mutation that causes her disease and to begin developing drugs that could treat, and possibly even cure, it.

Although rare diseases are still among the worst diagnoses to receive, it would not be a stretch to say there’s never been a better time to have one.

When Peeper’s parents received their daughter’s diagnosis, they didn’t tell her. She enjoyed a kickball-and-bicycles childhood in Ypsilanti, Michigan, and only became aware of her disorder when she was 8.

“I remember vividly, because I was getting dressed for Sunday school,” she told me. She realized that she could no longer fit her left hand through her sleeve. “My left wrist had locked in a backwards position”—the result of a new bone that had grown in her arm.

Peeper’s doctors took a muscle biopsy from her left forearm. Afterward, she wore a cast for six weeks. When it came off, she couldn’t flex her elbow. A new bone had frozen the joint.

Over the next decade, as Peeper grew more bones—rigid sheets stretching across her back, her right elbow locking, her left hip freezing—she became accustomed to pain.

Link: Paracetamol/Acetaminophen Can Soften Our Moral Reactions

Our moral reactions are easily influenced by a variety of factors. One of them is anxiety. When people are confronted with disturbing experiences like mortality salience (i.e., being made aware of their own eventual death), they tend to affirm their moral beliefs. As a result, they feel inclined to punish moral transgressions more harshly than they would without feeling fundamentally threatened. For example, in a now-classic study, people who objected to prostitution were asked to suggest a penalty for a woman arrested for prostitution. Participants who were led to reflect on their own mortality beforehand proposed a far higher bail than participants who thought about a less anxiety-inducing topic. Such belief-affirmation effects can also be evoked by psychologically disturbing experiences less severe than mortality salience. Hence, anxiety aroused by different situations can make our moral reactions more pronounced.

A few days ago, an interesting study was published in “Psychological Science”. The authors showed that the common over-the-counter pain reliever paracetamol counteracts the belief-affirming effect of anxiety. Participants who took a placebo showed the familiar response pattern in the “prostitution paradigm”: they suggested a harsher penalty for the prostitute under mortality salience (a bail of around $450) compared to a control condition (around $300). Participants who took paracetamol, however, didn’t react to mortality salience. Independent of what they had reflected on before, they suggested the same penalty for the prostitute (around $300). Paracetamol seems to have reduced the fundamental anxiety participants felt due to the mortality salience manipulation, so they didn’t have to affirm their moral beliefs as strongly. In a second experiment, the same effect of paracetamol was shown using a different disturbing experience (a surrealistic movie instead of mortality salience) and a different measurement of belief affirmation (a fine for rioters instead of a bail for a prostitute).

Hence, besides killing physical pain, paracetamol seems to be capable of counteracting the effect anxiety has on our moral reactions. From a scientific perspective, this is certainly an interesting finding. But what can we make of it from a practical ethics perspective? If we want a person’s moral reaction to be the result of cognition rather than emotion, paracetamol could be a means of bias reduction. However, some people might argue that if a person’s moral belief is the “correct” one, wanting transgressions to be punished comparatively severely might not be such a bad thing, even if the motivation for that is anxiety.

Link: Caring on Stolen Time: A Nursing Home Diary

I work in a place of death. People come here to die, and my co-workers and I care for them as they make their journeys. Sometimes these transitions take years or months. Other times, they take weeks or some short days. I count the time in shifts, in scheduled state visits, in the sham monthly meetings I never attend, in the announcements of the “Employee of the Month” (code word for best ass-kisser of the month), in the yearly pay increment of 20 cents per hour, and in the number of times I get called into the Human Resources office.

The nursing home residents also have their own rhythms. Their time is tracked by scheduled hospital visits; by the times when loved ones drop by to share a meal, to announce the arrival of a new grandchild, or to wait anxiously at their bedsides for heart-wrenching moments to pass. Their time is measured by transitions from processed food to pureed food, textures that match their increasing susceptibility to dysphagia. Their transitions are also measured by the changes from underwear to pull-ups and then to diapers. Even more than the loss of mobility, the use of diapers is often the most dreaded adaptation. For many people, lack of control over urinary functions and timing is the definitive mark of the loss of independence.

Many of the elderly I have worked with are, at least initially, aware of the transitions and respond with a myriad of emotions from shame and anger to depression, anxiety, and fear. Theirs was the generation that survived the Great Depression and fought the last “good war.” Aging was an anti-climactic twist to the purported grandeur and tumultuousness of their mid-twentieth-century youth.

“I am afraid to die. I don’t know where I will go,” a resident named Lara says to me, fear dilating her eyes.

“Lara, you will go to heaven. You will be happy,” I reply, holding the spoonful of pureed spinach to her lips. “Tell me about your son, Tobias.”

And so Lara begins, the same story of Tobias, of his obedience and intelligence, which I have heard over and over again for the past year. The son whom she loves, whose teenage portrait stands by her bedside. The son who has never visited, but whose name and memory calm Lara.

Lara is always on the lookout, especially for Alba and Mary, the two women with severe dementia who sit on both sides of her in the dining room. To find out if Alba is enjoying her meal, she will look to my co-worker Saskia to ask, “Is she eating? If she doesn’t want to, don’t force her to eat. She will eat when she is hungry.” Alba, always cheerful, smiles. Does she understand? Or is she in her usual upbeat mood? “Lara, Alba’s fine. With you watching out for her, of course she’s OK!” We giggle. These are small moments to be cherished.

In the nursing home, such moments are precious because they are accidental moments.

The residents run on stolen time. Alind, like me, a certified nursing assistant (CNA), comments, “Some of these residents are already dead before they come here.”

By “dead,” he is not referring to the degenerative effects of dementia and Alzheimer’s disease but to the sense of hopelessness and loneliness that many of the residents feel, not just because of physical pain, not just because of old age, but as a result of the isolation, the abandonment by loved ones, the anger of being caged within the walls of this institution. This banishment is hardly the ending they toiled for during their industrious youth.

By death, Alind was also referring to the many times “I’m sorry” is uttered in embarrassment, and the tearful shrieks of shame that sometimes follow when residents soil their clothes. This is the dying to which we, nursing home workers, bear witness every day; the death that the home is expected, somehow, to reverse.

So management tries, through bowling, through bingo and checkers, through Frank Sinatra sing-a-longs, to resurrect what has been lost to time, migration, the exigencies of the market, and the capriciousness of life. They substitute hot tea and cookies with strangers for the warmth of family and friends. Loved ones occupied by the same patterns of migration, work, ambition, ease their worries and guilt with pictures and reports of their relatives in these settings. We, the CNAs, shuffle in and out of these staged moments, to carry the residents off for toileting. The music playing in the building’s only bright and airy room is not for us, the immigrants, the lower hands, to plan for or share with the residents. Ours is a labor confined to the bathroom, to the involuntary, lower functions of the body. Instead of people of color in uniformed scrubs, white women with pretty clothes are paid more to care for the leisure-time activities of the old white people. The monotony and stress of our tasks are ours to bear alone.

The nursing home bosses freeze the occasional, carefully selected, picture-perfect moments on the front pages of their brochures, exclaiming that their facility, one of a group of Catholic homes, is indeed a place where “life is appreciated,” where “we care for the dignity of the human person.” In reality, they have not tried to make that possible. Under poor conditions, we have improvised for genuine human connection to exist. How we do that the bosses do not understand.


Link: Do No Harm: On Body Integrity Disorder

Why do some people want to cut off a perfectly healthy limb?

This wasn’t the first time that David had tried to amputate his leg. When he was just out of college, he’d tried to do it using a tourniquet fashioned out of an old sock and strong baling twine.

David locked himself in his bedroom at his parents’ house, his bound leg propped up against the wall to prevent blood from flowing into it. After two hours the pain was unbearable, and fear sapped his will.

Undoing a tourniquet that has starved a limb of blood can be fatal: injured muscles downstream of the blockage flood the body with toxins, causing the kidneys to fail. Even so, David released the tourniquet himself; it was just as well that he hadn’t mastered the art of tying one.

Failure did not lessen David’s desire to be rid of the leg. It began to consume him, to dominate his awareness. The leg was always there as a foreign body, an impostor, an intrusion.

He spent every waking moment imagining freedom from the leg. He’d stand on his “good” leg, trying not to put any weight on the bad one. At home, he’d hop around. While sitting, he’d often push the leg to one side. The leg just wasn’t his. He began to blame it for keeping him single; but living alone in a small suburban townhouse, afraid to socialise and struggling to form relationships, David was unwilling to let anyone know of his singular fixation.

David is not his real name. He wouldn’t discuss his condition without the protection of anonymity. After he agreed to talk, we met in the waiting area of a nondescript restaurant, in a nondescript mall just outside one of America’s largest cities. A handsome man, David resembles a certain edgy movie star whose name, he fears, might identify him to his co-workers. He’s kept his secret well hidden: I am only the second person he has confided in, face to face, about his leg.

The cheerful guitar music in the restaurant lobby clashed with David’s mood. He choked up as he recounted his depression. I’d heard his voice cracking when we’d spoken earlier on the phone, but watching this grown man so full of emotion was difficult. The restaurant’s buzzer went off. Our table inside was ready, but David didn’t want to go in. Even though his voice was shaking, he wanted to keep talking.

“It got to the point where I’d come into my house and just cry,” he had told me earlier over the phone. “I’d be looking at other people and seeing that they already have their lives going good for them. And I’m stuck here, all miserable. I’m being held back by this strange obsession. The logic going through my head was that I need to take care of this now, because if I wait any longer, there is not much chance of a life for me.”

It took some time for David to open up. Early on, when we were just getting to know each other, he was shy and polite, confessing that he wasn’t very good at talking about himself. He had avoided seeking professional psychiatric help, afraid that doing so would somehow endanger his employment. And yet he knew that he was slipping into a dark place. He began associating his house with the feeling of being alone and depressed. Soon he came home only to sleep; he couldn’t be in the house during the day without breaking into tears.

One night about a year ago, when he could bear it no longer, David called his best friend. There was something he had been wanting to reveal his whole life, David told him. His friend’s response was empathetic — exactly what David needed. Even as David was speaking he began searching online for material. “He told me that there was something in my eyes the whole time I was growing up,” David said. “It looked like I had pain in my eyes, like there was something I wasn’t telling him.” Once David opened up, he discovered that he was not alone. He found a community on the internet of others who were also desperate to excise some part of their body — usually a limb, sometimes two. These people were suffering from what is now called Body Integrity Identity Disorder (BIID).

The online community has been a blessing to those who suffer from BIID, and through it many discover that their malaise has an official name. With a handful of websites and a few thousand members, the community even has its internal subdivisions: “devotees” are fascinated by or attracted to amputees, often sexually, but don’t want amputations themselves; “wannabes” strongly desire an amputation of their own. A further delineation, “need-to-be,” describes someone whose desire for amputation is particularly fierce.

It was a wannabe who told David about a former BIID patient who had been connecting other sufferers to a surgeon in Asia. For a fee, this doctor would perform off-the-book amputations. David contacted this gatekeeper on Facebook, but more than a month passed without a reply. As his hopes of surgery began to fade, David’s depression deepened. The leg intruded more insistently into his thoughts. He decided to try again to get rid of it himself.

This time he settled for dry ice, one of the preferred methods of self-amputation among the BIID community. The idea is to freeze the offending limb and damage it to the point that doctors have no choice but to amputate. David drove over to his local Walmart and bought two large trashcans. The plan was brutal, but simple. First, he would submerge the leg in a can full of cold water to numb it. Then he would pack it in a can full of dry ice until it was injured beyond repair.

He bought rolls of bandages, but he couldn’t find the dry ice or the prescription painkillers he needed if he was going to keep the leg in dry ice for eight hours. David went home despondent, with just two trashcans and bandages, preparing himself mentally to go out the next day to find the other ingredients. The painkillers were essential; he knew that without them he would never succeed. Then, before going to bed that night, he checked his computer.

There it was: a message. The gatekeeper wanted to talk.

We are only just beginning to understand BIID. It hasn’t helped that the medical establishment has generally dismissed the condition as a perversion. Yet there is evidence that it has existed for hundreds of years. In a recent paper, Peter Brugger, the head of neuropsychology at University Hospital Zurich, Switzerland, cites the case of an Englishman who went to France in the late 18th century and asked a surgeon to amputate his leg. When the surgeon refused, the Englishman held him up at gunpoint, forcing him to perform the operation. After returning home, he sent the surgeon 250 guineas and a letter of thanks, in which he wrote that his leg had been “an invincible obstacle” to his happiness.

The first modern account of the condition dates from 1977, when The Journal of Sex Research published a paper on “apotemnophilia” — the desire to be an amputee. The paper categorised the desire for amputation as a paraphilia, a catchall term used for deviant sexual desires. Although it’s true that most people who desire such amputations are sexually attracted to amputees, the term paraphilia has long been a convenient label for misunderstandings: after all, at one time homosexuality was also labelled as paraphilia.

One of the co-authors of the 1977 paper was Gregg Furth, who eventually became a practising psychologist in New York. Furth himself suffered from the condition and, over time, became a major figure in the BIID underground. He wanted to help people deal with their problem, but medical treatment was always controversial — often for good reason. In 1998, Furth introduced a friend to an unlicensed surgeon who agreed to amputate the friend’s leg in a Tijuana clinic. The patient died of gangrene and the surgeon was sent to prison. A Scottish surgeon named Robert Smith, who practised at the Falkirk and District Royal Infirmary, briefly held out legal hope for BIID sufferers by openly performing voluntary amputations, but a media frenzy in 2000 led British authorities to forbid such procedures. The Smith affair fuelled a series of articles about the condition — some suggesting that merely identifying and defining such a condition could cause it to spread, like a virus.

Undeterred, Furth found a surgeon in Asia who was willing to perform amputations for about $6,000. But instead of getting the surgery himself, he began acting as a go-between, putting sufferers in touch with the surgeon.

He also contacted Michael First, a clinical psychiatrist at Columbia University in New York. Intrigued, First embarked on a survey of 52 patients. What he found was illuminating. The patients all seemed to be obsessed by the thought of a body that was different in some way from the one they possessed. There seemed to be a mismatch between their internal sense of their own bodies and their physical bodies. First, who would later lobby to have BIID more widely recognised, became convinced that he was looking at a disorder of identity, of the sense of self.

“The name that was originally proposed, apotemnophilia, was clearly a problem,” he told me. “We wanted a word that was parallel to gender identity disorder. GID has built into the name a concept that there is a function called gender identity, which is your sense of being male or female, which has gone wrong. So, what would be a parallel notion? Body integrity identity disorder hypothesises that a normal function, which is your comfort in how your body fits together, has gone wrong.”

Link: Why an MRI costs $1,080 in the US & $280 in France

There is a simple reason health care in the United States costs more than it does anywhere else: The prices are higher.

That may sound obvious. But it is, in fact, key to understanding one of the most pressing problems facing our economy. In 2009, Americans spent $7,960 per person on health care. Our neighbors in Canada spent $4,808. The Germans spent $4,218. The French, $3,978. If we had the per-person costs of any of those countries, America’s deficits would vanish. Workers would have much more money in their pockets. Our economy would grow more quickly, as our exports would be more competitive.

There are many possible explanations for why Americans pay so much more. It could be that we’re sicker. Or that we go to the doctor more frequently. But health researchers have largely discarded these theories. As Gerard Anderson, Uwe Reinhardt, Peter Hussey and Varduhi Petrosyan put it in the title of their influential 2003 study on international health-care costs, “it’s the prices, stupid.”

As it’s difficult to get good data on prices, that paper pinned the blame on prices largely by eliminating the other possible culprits. The authors considered, for instance, the idea that Americans were simply using more health-care services, but on close inspection found that Americans don’t see the doctor more often or stay longer in the hospital than residents of other countries. Quite the opposite, actually. We spend less time in the hospital than Germans and see the doctor less often than the Canadians.

“The United States spends more on health care than any of the other OECD countries spend, without providing more services than the other countries do,” they concluded. “This suggests that the difference in spending is mostly attributable to higher prices of goods and services.”

On Friday, the International Federation of Health Plans — a global insurance trade association that includes more than 100 insurers in 25 countries — released more direct evidence. It surveyed its members on the prices paid for 23 medical services and products in different countries, asking after everything from a routine doctor’s visit to a dose of Lipitor to coronary bypass surgery. And in 22 of 23 cases, Americans are paying higher prices than residents of other developed countries. Usually, we’re paying quite a bit more. The exception is cataract surgery, which appears to be costlier in Switzerland, though cheaper everywhere else.

Prices don’t explain all of the difference between America and other countries. But they do explain a big chunk of it. The question, of course, is why Americans pay such high prices — and why we haven’t done anything about it.

“Other countries negotiate very aggressively with the providers and set rates that are much lower than we do,” Anderson says. They do this in one of two ways. In countries such as Canada and Britain, prices are set by the government. In others, such as Germany and Japan, they’re set by providers and insurers sitting in a room and coming to an agreement, with the government stepping in to set prices if they fail.

Health care is an unusual product in that it is difficult, and sometimes impossible, for the customer to say “no.” In certain cases, the customer is passed out, or otherwise incapable of making decisions about her care, and the decisions are made by providers whose mandate is, correctly, to save lives rather than money.

In other cases, there is more time for loved ones to consider costs, but little emotional space to do so — no one wants to think there was something more they could have done to save their parent or child. It is not like buying a television, where you can easily comparison shop and walk out of the store, and even forgo the purchase if it’s too expensive. And imagine what you would pay for a television if the salesmen at Best Buy knew that you couldn’t leave without making a purchase.

In America, Medicare and Medicaid negotiate prices on behalf of their tens of millions of members and, not coincidentally, purchase care at a substantial markdown from the commercial average. But outside that, it’s a free-for-all. Providers largely charge what they can get away with, often offering different prices to different insurers, and an even higher price to the uninsured.

“In my view, health is a business in the United States in quite a different way than it is elsewhere,” says Tom Sackville, who served in Margaret Thatcher’s government and now directs the IFHP. “It’s very much something people make money out of. There isn’t too much embarrassment about that compared to Europe and elsewhere.”

The result is that, unlike in other countries, sellers of health-care services in America have considerable power to set prices, and so they set them quite high. Two of the five most profitable industries in the United States — the pharmaceuticals industry and the medical device industry — sell health care. With margins of almost 20 percent, they beat out even the financial sector for sheer profitability.

Link: Scott and Scurvy

How the cure for scurvy, discovered in 1747, had been forgotten by the time of Scott’s expedition to the Antarctic in 1911.

Recently I have been re-reading one of my favorite books, The Worst Journey in the World, an account of Robert Falcon Scott’s 1911 expedition to the South Pole. I can’t do the book justice in a summary, other than recommend that you drop everything and read it, but there is one detail that particularly baffled me the first time through, and that I resolved to understand better once I could stand to put the book down long enough.

Writing about the first winter the men spent on the ice, the book’s author, Apsley Cherry-Garrard, casually mentions an astonishing lecture on scurvy by one of the expedition’s doctors:

Atkinson inclined to Almroth Wright’s theory that scurvy is due to an acid intoxication of the blood caused by bacteria…
There was little scurvy in Nelson’s days; but the reason is not clear, since, according to modern research, lime-juice only helps to prevent it. We had, at Cape Evans, a salt of sodium to be used to alkalize the blood as an experiment, if necessity arose. Darkness, cold, and hard work are in Atkinson’s opinion important causes of scurvy.

Now, I had been taught in school that scurvy had been conquered in 1747, when the Scottish physician James Lind proved in one of the first controlled medical experiments that citrus fruits were an effective cure for the disease. From that point on, we were told, the Royal Navy had required a daily dose of lime juice to be mixed in with sailors’ grog, and scurvy ceased to be a problem on long ocean voyages.

But here was a Royal Navy surgeon in 1911 apparently ignorant of what caused the disease, or how to cure it. Somehow a highly trained group of scientists at the start of the 20th century knew less about scurvy than the average sea captain in Napoleonic times. Scott left a base abundantly stocked with fresh meat, fruits, apples, and lime juice, and headed out on the ice for five months with no protection against scurvy, all the while confident he was not at risk. What happened?

By all accounts scurvy is a horrible disease. Scott, who has reason to know, gives a succinct description:

The symptoms of scurvy do not necessarily occur in a regular order, but generally the first sign is an inflamed, swollen condition of the gums. The whitish pink tinge next the teeth is replaced by an angry red; as the disease gains ground the gums become more spongy and turn to a purplish colour, the teeth become loose and the gums sore. Spots appear on the legs, and pain is felt in old wounds and bruises; later, from a slight oedema, the legs, and then the arms, swell to a great size and become blackened behind the joints. After this the patient is soon incapacitated, and the last horrible stages of the disease set in, from which death is a merciful release.

One of the most striking features of the disease is the disproportion between its severity and the simplicity of the cure. Today we know that scurvy is due solely to a deficiency in vitamin C, a compound essential to metabolism that the human body must obtain from food. Scurvy is rapidly and completely cured by restoring vitamin C into the diet.

Except for the nature of vitamin C, eighteenth century physicians knew this too. But in the second half of the nineteenth century, the cure for scurvy was lost. The story of how this happened is a striking demonstration of the problem of induction, and how progress in one field of study can lead to unintended steps backward in another.

An unfortunate series of accidents conspired with advances in technology to discredit the cure for scurvy. What had been a simple dietary deficiency became a subtle and unpredictable disease that could strike without warning. Over the course of fifty years, scurvy would return to torment not just Polar explorers, but thousands of infants born into wealthy European and American homes.

Link: How Doctors Die

It’s not a frequent topic of discussion, but doctors die, too. And they don’t die like the rest of us. What’s unusual about them is not how much treatment they get compared to most Americans, but how little. For all the time they spend fending off the deaths of others, they tend to be fairly serene when faced with death themselves. They know exactly what is going to happen, they know the choices, and they generally have access to any sort of medical care they could want. But they go gently.

Of course, doctors don’t want to die; they want to live. But they know enough about modern medicine to know its limits. And they know enough about death to know what all people fear most: dying in pain, and dying alone. They’ve talked about this with their families. They want to be sure, when the time comes, that no heroic measures will happen–that they will never experience, during their last moments on earth, someone breaking their ribs in an attempt to resuscitate them with CPR (that’s what happens if CPR is done right).

Almost all medical professionals have seen what we call “futile care” being performed on people. That’s when doctors bring the cutting edge of technology to bear on a grievously ill person near the end of life. The patient will get cut open, perforated with tubes, hooked up to machines, and assaulted with drugs. All of this occurs in the Intensive Care Unit at a cost of tens of thousands of dollars a day. What it buys is misery we would not inflict on a terrorist. I cannot count the number of times fellow physicians have told me, in words that vary only slightly, “Promise me if you find me like this that you’ll kill me.” They mean it. Some medical personnel wear medallions stamped “NO CODE” to tell physicians not to perform CPR on them. I have even seen it as a tattoo.

To administer medical care that makes people suffer is anguishing. Physicians are trained to gather information without revealing any of their own feelings, but in private, among fellow doctors, they’ll vent. “How can anyone do that to their family members?” they’ll ask. I suspect it’s one reason physicians have higher rates of alcohol abuse and depression than professionals in most other fields. I know it’s one reason I stopped participating in hospital care for the last 10 years of my practice.

How has it come to this–that doctors administer so much care that they wouldn’t want for themselves? The simple, or not-so-simple, answer is this: patients, doctors, and the system.

(Source: sunrec)

Awakening

Since its introduction in 1846, anesthesia has allowed for medical miracles. Limbs can be removed, tumors examined, organs replaced—and a patient will feel and remember nothing. Or so we choose to believe. In reality, tens of thousands of patients each year in the United States alone wake up at some point during surgery. Since their eyes are taped shut and their bodies are usually paralyzed, they cannot alert anyone to their condition. In efforts to eradicate this phenomenon, medicine has been forced to confront how little we really know about anesthesia’s effects on the brain. The doctor who may be closest to a solution may also answer a question that has confounded centuries’ worth of scientists and philosophers: What does it mean to be conscious?

… This experience is called “intraoperative recall” or “anesthesia awareness,” and it’s more common than you might think. Although studies diverge, most experts estimate that for every 1,000 patients who undergo general anesthesia each year in the United States, one to two will experience awareness. Patients who awake hear surgeons’ small talk, the swish and stretch of organs, the suctioning of blood; they feel the probing of fingers, the yanks and tugs on innards; they smell cauterized flesh and singed hair. But because one of the first steps of surgery is to tape patients’ eyes shut, they can’t see. And because another common step is to paralyze patients to prevent muscle twitching, they have no way to alert doctors that they are awake.

Many of these cases are benign: vague, hazy flashbacks. But up to 70 percent of patients who experience awareness suffer long-term psychological distress, including PTSD—a rate five times higher than that of soldiers returning from Iraq and Afghanistan. Campbell now understands that this is what happened to her, although she didn’t believe it at first. “The whole idea of anesthesia awareness seemed over-the-top,” she told me. “It took years to begin to say, ‘I think this is what happened to me.’ ” She describes her memories of the surgery like those from a car accident: the moments before and after are clear, but the actual event is a shadowy blur of emotion. She searched online for people with similar experiences, found a coalition of victims, and eventually traveled up the East Coast to speak with some of them. They all shared a constellation of symptoms: nightmares, fear of confinement, the inability to lie flat (many sleep in chairs), and a sense of having died and returned to life. Campbell (whose name and certain other identifying details have been changed) struggles especially with the knowledge that there is no way for her to prove that she woke up, and that many, if not most, people might not believe her. “Anesthesia awareness is an intrapersonal event,” she says. “No one else sees it. No one else knows it. You’re the only one.”

Sizemore complained of being unable to breathe and claimed that people were trying to bury him alive. He suffered from insomnia; when he could sleep, he had vivid nightmares.

In most cases of awareness, patients are awake but still dulled to pain. But that was not the case for Sherman Sizemore Jr., a Baptist minister and former coal miner who was 73 when he underwent an exploratory laparotomy in early 2006 to pinpoint the cause of recurring abdominal pain. In this type of procedure, surgeons methodically explore a patient’s viscera for evidence of abnormalities. Although there are no official accounts of Sizemore’s experience, his family maintained in a lawsuit that he was awake—and feeling pain—throughout the surgery. (The suit was settled in 2008.) He reportedly emerged from the operation behaving strangely. He was afraid to be left alone. He complained of being unable to breathe and claimed that people were trying to bury him alive. He refused to be around his grandchildren. He suffered from insomnia; when he could sleep, he had vivid nightmares.

The lawsuit claimed that Sizemore was tormented by doubt, wondering whether he had imagined the horrific pain. No one advised Sizemore to seek psychiatric help, his family alleged, and no one mentioned the fact that many patients who experience awareness suffer from PTSD. On February 2, 2006, two weeks after his surgery, Sizemore shot himself. He had no history of psychiatric illness.

Link: Amnesia and the Self That Remains When Memory Is Lost

Tom was one of those people we all have in our lives — someone to go out to lunch with in a large group, but not someone I ever spent time with one-on-one. We had some classes together in college and even worked in the same cognitive psychology lab for a while. But I didn’t really know him. Even so, when I heard that he had brain cancer that would kill him in four months, it stopped me cold.

I was 19 when I first saw him — in a class taught by a famous neuropsychologist, Karl Pribram. I’d see Tom at the coffee house, the library, and around campus. He seemed perennially enthusiastic, and had an exaggerated way of moving that made him seem unusually focused. I found it uncomfortable to make eye contact with him, not because he seemed threatening, but because his gaze was so intense.

Once Tom and I were sitting next to each other when Pribram told the class about a colleague of his who had just died a few days earlier. Pribram paused to look out over the classroom and told us that his colleague had been one of the greatest neuropsychologists of all time. Pribram then lowered his head and stared at the floor for such a long time I thought he might have discovered something there. Without lifting his head, he told us that his colleague had been a close friend, and had telephoned a month earlier to say he had just been diagnosed with a brain tumor growing in his temporal lobe. The doctors said that he would gradually lose his memory — not his ability to form new memories, but his ability to retrieve old ones … in short, to understand who he was.

Tom’s hand shot up. To my amazement, he suggested that Pribram was overstating the connection between temporal-lobe memory and overall identity. Temporal lobe or not, you still like the same things, Tom argued — your sensory systems aren’t affected. If you’re patient and kind, or a jerk, he said, such personality traits aren’t governed by the temporal lobes.

Pribram was unruffled. Many of us don’t realize the connection between memory and self, he explained. Who you are is the sum total of all that you’ve experienced. Where you went to school, who your friends were, all the things you’ve done or — just as importantly — all the things you’ve always hoped to do. Whether you prefer chocolate ice cream or vanilla, action movies or comedies, is part of the story, but the ability to know those preferences through accumulated memory is what defines you as a person. This seemed right to me. I’m not just someone who likes chocolate ice cream, I’m someone who knows, who remembers that I like chocolate ice cream. And I remember my favorite places to eat it, and the people I’ve eaten it with.

Pribram walked up to the lectern and gripped it with both hands. When they had spoken last, his colleague seemed more sad than frightened. He was worried about the loss of self more than the loss of memory. He’d still have his intelligence, the doctors said, but no memories. “What good is one without the other?” his colleague had asked. That was the last time Pribram spoke to him.

From a friend, Pribram had learned that his colleague had decided to go to the Caribbean for a vacation with his wife. One day he just walked out into the ocean and never came back. He couldn’t swim; he must have gone out with the intention of not coming back — before the damage from the tumor could take hold, Pribram said.

The room was silent for 10 or 15 seconds — stone silent. I looked over at Tom’s notebook. “Neuropsychologist contemplates losing his mind,” Tom had written.

If he had lived, Pribram’s colleague would have experienced what neuroscientists call retrograde amnesia. This is the kind of amnesia that is most often dredged up as a plot element in bad comedies and cheap mystery stories; so-and-so gets hit on the head and then can’t remember who he is anymore, wanders around aimlessly, finding himself in zany predicaments, until he gets hit on the head again and his memory remarkably returns. This almost never occurs in real life. Although retrograde amnesia is real, it’s usually the result of a tumor, stroke, or other organic brain trauma. It isn’t restored by a knock on the head. Because they can still form new memories, patients with retrograde amnesia are acutely aware that they have a cognitive deficit and painfully knowledgeable about what they are losing.

Link: Death at Yosemite: The Story Behind Last Summer's Hantavirus Outbreak

On December 10, Yosemite National Park began demolishing 91 tent cabins in Curry Village, a rustic encampment of 408 canvas-sided cabins jammed into a pine-and-cedar glade near the sloping shoulders of Half Dome. It was here that an outbreak of hantavirus began last summer, infecting at least 10 people and killing three.

But on Sunday, June 10, 2012, the campground seemed idyllic. That weekend held all the promise of early summer. The Curry Village swimming pool was open. The smell of hot dogs and nachos curled out of the snack bar. The sun bounced off the face of Glacier Point. Kids in “Go Climb a Rock” T-shirts shouted and chased each other on bikes.

Sometime that day, a 49-year-old woman from the Los Angeles area arrived at Curry Village’s front desk, a plain wood-floor office that’s often cacophonous with the sound of staffers checking guests in and out. A clerk handed her a key to one of the 91 “signature tent cabins” that opened three years ago—the “new 900s” as they were collectively known. Unlike the older cabins, which are sided with single-ply vinyl-coated canvas, the signature cabins boasted double-wall plywood construction and propane heaters, making them warmer and quieter than the older units.

Off she went, this Southern California lady, to enjoy her Yosemite vacation. We’ll call her Visitor One.

About the same time, another guest checked into Curry Village. He was a 36-year-old man from Alameda County, California, which encompasses Berkeley, Oakland, and the East Bay region. He was given the key to a cabin close to Visitor One’s. He dropped off his things and went about his business. We’ll call him Visitor Two.

We don’t know exactly how Visitors One and Two spent their four days in the park. Medical confidentiality laws forbid public-health officials from releasing their names, and they and their families have chosen to keep their stories private. Maybe they hiked to the top of Half Dome or enjoyed the giant sequoias of the Mariposa Grove. By the following Wednesday, June 13, both visitors had checked out of their Curry Village tent cabins and left the park.

Around Yosemite the summer unfolded quietly. The search-and-rescue team went out on minor events: an ankle fracture on the Panorama Trail, a fallen hiker on the Half Dome cable route. Rangers kept a wary eye on the Cascade Fire, a lightning-sparked wilderness blaze that smoldered through a red fir forest.

Then, in late June, Visitor One fell ill. She might have felt like she had the flu: chills, muscle aches, fever, headache, dizziness, fatigue. The flu goes away after a few days. This didn’t. We do know that, back home, she went to see her doctor. When presented with Visitor One’s symptoms, most physicians would have dismissed them as the flu or, at worst, low-level pneumonia. Her doctor didn’t. They talked about what she might have picked up and where. She mentioned her Yosemite trip. The doctor took the unusual step of calling Charles Mosher, a public-health officer for Mariposa County, which encompasses Yosemite, and asking if there were any known hantavirus cases in the area. “Based on her history and symptoms, [hantavirus] was a definite possibility,” Mosher recalled, so he and Visitor One’s doctor agreed that starting treatment for the virus while awaiting lab confirmation was the prudent way to go.

That was, given the circumstance, about the worst thing Visitor One could hear.