The Forgetting: Understanding Alzheimer’s: A Biography of a Disease

David Shenk


Winner of the 2002 BMA Popular Medicine Book Prize: This is a haunting literary and scientific examination of Alzheimer’s disease and the race to find a cure.

‘A truly remarkable book – the definitive work on Alzheimer’s, both in social and medical terms, “The Forgetting” is incisive, humane, never ponderous, full of dry humour and brilliantly written with quiet, unpretentious authority. As a layman with personal experience of “caring” for an Alzheimer’s sufferer I am well aware of the stages of the disease and its prognosis and ending. Shenk is excellent on all these, and in his reflections on memory and the individual, and the individual’s response to the progress of the disease. I can’t imagine a book on Alzheimer’s being better researched and understood, or presented with greater sympathy.’ John Bayley

In 1906 Alois Alzheimer dissected and examined the cerebral cortex of Auguste D’s brain and became the first scientist in medical history to link a specific brain pathology to behavioural changes. The disease named after him turns otherwise active and healthy people into living ghosts. It is a rare condition for those in their 40s and 50s, but 10% of the 65+ population suffers from it, and 50% of the 85+. It is longevity’s revenge, and as the baby boom generation drifts into its elderly years the number of Alzheimer’s victims is expected to quadruple, making it the fastest-growing disease in developed countries.

As Adam Phillips writes in his foreword: ‘This remarkable book will radically change our notions of looking after people and our assumptions about independence. Out of fear of mortality we have idealised health and youth and competence. “The Forgetting” reminds us among many other things that there is more to life than that.’

Shenk’s history of Alzheimer’s is both poignant and scientific, grounded by the fundamental belief that memory forms the basis of our selves, our souls, and the meaning in our lives.









THE FORGETTING

Understanding Alzheimer’s:

A Biography of a Disease


DAVID SHENK









COPYRIGHT


William Collins

An imprint of HarperCollinsPublishers Ltd, 1 London Bridge Street, London SE1 9GF

www.harpercollins.co.uk

Published by Flamingo 2003

First published in Great Britain by HarperCollinsPublishers 2002

First published in the US by Doubleday 2001

Copyright © David Shenk 2001

David Shenk asserts the moral right to be identified as the author of this work

Several of the names and identifying characteristics of the individuals depicted in this book have been changed to protect their privacy.

All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the nonexclusive, nontransferable right to access and read the text of this ebook on screen. No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage and retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins ebooks.

HarperCollinsPublishers has made every reasonable effort to ensure that any picture content and written content in this ebook has been included or removed in accordance with the contractual and technological constraints in operation at the time of publication

Source ISBN: 9780006532088

Ebook Edition © DECEMBER 2013 ISBN: 9780007439669

Version: 2016-09-09




DEDICATION


For Lucy




CONTENTS


Cover

Title Page

Copyright

Dedication

Prologue

PART I EARLY STAGE

1. I Have Lost Myself

2. Bothered

3. The God Who Forgot and the Man Who Could Not

4. The Race

5. Irrespective of Age

6. A Most Loving Brother

PART II MIDDLE STAGE

7. Fumbling for the Name of My Wife

8. Back to Birth

9. National Institute of Alzheimer’s

10. Ten Thousand Feet, at Ten O’Clock at Night

11. A World of Struldbruggs

12. Humanize the Mouse

13. We Hope to Radio Back to Earth Images of Beauty Never Seen

PART III END STAGE

14. Breakthrough?

15. One Thousand Subtractions

16. Things to Avoid

17. The Mice Are Smarter

Epilogue

Resources for Patients and Families

Sources

Index

Acknowledgments

About the Author

Praise

About the Publisher


LEAR: Does any here know me? This is not Lear.

Does Lear walk thus, speak thus? Where are his eyes?

Either his notion weakens, his discernings

Are lethargied—Ha! Waking? ’Tis not so.

Who is it that can tell me who I am?

FOOL: Lear’s shadow.

—William Shakespeare, King Lear




PROLOGUE


“When I was younger,” Mark Twain quipped near the end of his life, “I could remember anything, whether it had happened or not; but my faculties are decaying now and soon I shall be so I cannot remember any but the things that never happened.”

At age seventy-two, Twain’s memory and wit were intact. But behind his remark lay a grim recollection of another celebrated writer’s true decline. In December 1877, Twain had come to Boston at the invitation of William Dean Howells, editor of the Atlantic Monthly, to satirize a group of Brahmin intellectuals. Among Twain’s targets that night was the father of American Transcendentalism, Ralph Waldo Emerson.

It was after midnight when Twain finally took to the floor at the Hotel Brunswick to spin his yarn. He told the venerable crowd about a lonely miner who had been victimized by three tramps claiming to be famous American writers. The literary outlaws stormed into the miner’s cabin, ate his beans and bacon, guzzled his whiskey, and stole his only pair of boots. They played cards and fought bitterly. One of the tramps called himself Emerson.

The point of the skit was to poke some harmless fun at Emerson by corrupting some of his noble expressions. As they played cards at the climax of the story, the Emerson hobo spat out contorted fragments of his poem “Brahma.” A mystical paean to immortality, the original included these stanzas:

If the red slayer think he slays,

Or if the slain think he is slain,

They know not well the subtle ways

I keep, and pass, and turn again.

They reckon ill who leave me out;

When me they fly, I am the wings;

I am the doubter and the doubt,

And I the hymn the Brahmin sings.

Twain twisted the verse into drunken poker banter:

I am the doubter and the doubt—

They reckon ill who leave me out,

They know not well the subtle ways I keep,

I pass and deal again.

An elegant master of spoof, Twain was revered around the world as the funniest living man. But on this important night, his material bombed. From the start, Twain drew only silence and quizzical looks, most prominently from Emerson himself. At the finish, Twain later recalled, there “fell a silence weighing many tons to the square inch.” He was humiliated. Shortly afterward, he sent a letter of apology to Emerson.

Only then did Twain learn of the hidden backdrop to his performance: Emerson had been present only in body, not in mind. Emerson’s dead silence and flat affect, Twain discovered, were a function of neither offense nor boredom. As his daughter Ellen wrote to Twain in reply, it was simply that he had not understood a word of what Twain was saying.

At age seventy-four, this was no longer the Ralph Waldo Emerson who had written “Self-Reliance” and Nature; who had said, “Insist on yourself; never imitate”; who had mentored Henry David Thoreau; the Emerson of whom James Russell Lowell had said, “When one meets him the Fall of Adam seems a false report.”

This was now a very different man, a waning crescent, caught in the middle stages of a slow, progressive memory disorder that had ravaged his concentration and short-term memory and so dulled his perceptions that he was no longer able to understand what he read or follow a conversation.

“To my father,” Ellen wrote to Twain of the performance, “it is as if it had not been; he never quite heard, never quite understood it, and he forgets easily and entirely.”

One of the great minds in Western civilization was wasting away inside a still vigorous body, and there was nothing that anyone could hope to do about it.

Taos, New Mexico: March 1999

They came from Melbourne, Mannheim, St. Louis, London, and Kalamazoo; from Lexington, Stockholm, Dallas, Glasgow, Toronto, and Kuopio. From Tokyo, Zurich, and Palo Alto.

Some took two flights, others three or four, followed by a winding three-hour van ride from the floodplains of Albuquerque, up through the high desert terrain of Los Alamos, past the Sandia mountains, past the Jemez volcanic range, past the Camel Rock, Cities of Gold, and OK casinos, up near the foothills of the Sangre de Cristo mountains.

More than two hundred molecular biologists gathered in the small but sprawling city of Taos, amidst the adobe homes and green-chile quesadillas, to share data and hypotheses. This high-altitude, remote desert seemed like a strange place to fight a threatening disease. But specialists at the biannual conference “Molecular Mechanisms in Alzheimer’s Disease” needed a refuge from their routine obligations.

For four and a half days they met in Bataan Hall, an old ballroom converted into a civic center. The room had once been used as a shipping-off point for soldiers in World War II, and was later named in memory of those same soldiers’ wretched ordeal in the infamous Bataan Death March. Some five hundred prisoners died each day on that trek, about the same number now dying each day in the U.S. from Alzheimer’s disease.

At 8:00 P.M. on the first evening, Stanley Prusiner, a biologist at the University of California at San Francisco and a 1997 recipient of the Nobel Prize in medicine, rose to give the keynote address. “I can’t compete with Monica,” he began with a shrug. “But I think we all know that we wouldn’t learn anything new.”

Barbara Walters’ much-anticipated TV interview with Monica Lewinsky was starting to air on ABC at that very moment, which further fueled the sense of isolation. The local support staff had just raced home to their televisions to catch the well-lighted promotion for the million-dollar book about the sordid affair with the needy President.

No TVs here. The scientists in this large, windowless chamber were distracted by something else: Alzheimer’s disease was about to become an epidemic. Known as senility for thousands of years, Alzheimer’s had only in the past few decades become a major health problem. Five million Americans and perhaps 15 million people worldwide now had the incurable disease, and those numbers would soon look attractive. Beginning in 2011, the first of the baby boomers would turn sixty-five and start to unravel in significant numbers. By 2050, about 15 million people in the U.S. alone would have Alzheimer’s, at an annual cost of as much as $700 billion.

Other industrialized nations faced the same trends. In Japan, one in three would be elderly by 2050. In Canada, the number of elderly would increase by 50 percent while the working-age population increased by just 2 percent. In Britain and elsewhere in industrialized Europe, eighty-five-and-over would continue to be the fastest growing segment of the population. “We have to solve this problem, or it’s going to overwhelm us,” said Zaven Khachaturian, former director of the Alzheimer’s Research Office at the National Institutes of Health. Alzheimer’s had already become a costly and miserable fixture in society. Unless something was done to stop the disease, it would soon become one of the defining characteristics of civilization, one of the cornerstones of the human experience.

They were here to solve this problem.




PART I


EARLY STAGE



The other day I was all confused in the street for a split second. I had to ask somebody where I was, and I realized the magnitude of this disease. I realized that this is a whole structure in which a window falls out, and then suddenly before you know it, the whole façade breaks apart.

This is the worst thing that can happen to a thinking person. You can feel yourself, your whole inside and outside, break down.

—M.

New York, New York




Chapter 1 I HAVE LOST MYSELF


A healthy, mature human brain is roughly the size and shape of two adult fists, closed and pressed together at the knuckles. Weighing three pounds, it consists mainly of about a hundred billion nerve cells—neurons—linked to one another in about one hundred trillion separate pathways. It is by far the most complicated system known to exist in nature or civilization, a control center for the coordination of breathing, swallowing, pressure, pain, fear, arousal, sensory perception, muscular movement, abstract thought, identity, mood, and a varied suite of memories in a symphony that is partly predetermined and partly adaptable on the fly. The brain is so ridiculously complex, in fact, that in considering it in any depth one can only reasonably wonder why it works so well so much of the time.

Mostly, we don’t think about it at all. We simply take this nearly silent, ludicrously powerful electrochemical engine for granted. We feed it, try not to smash it too hard against walls or windshields, and let it work its magic for us.

Only when it begins to fail in some way, only then are we surprised, devastated, and in awe.

On November 25, 1901, a fifty-one-year-old woman with no personal or family history of mental illness was admitted to a psychiatric hospital in Frankfurt, Germany, by her husband, who could no longer ignore or hide quirks and lapses that had overtaken her in recent months. First there were unexplainable bursts of anger, and then a strange series of memory problems. She became increasingly unable to locate things in her own home and began to make surprising mistakes in the kitchen. By the time she arrived at Städtische Irrenanstalt, the Frankfurt Hospital for the Mentally Ill and Epileptics, her condition was as severe as it was curious. The attending doctor, senior physician Alois Alzheimer, began the new file with these notes in the old German Sütterlin script:

She sits on the bed with a helpless expression.

“What is your name?”

Auguste.

“Last name?”

Auguste.

“What is your husband’s name?”

Auguste, I think.

“How long have you been here?”

(She seems to be trying to remember.)

Three weeks.

It was her second day in the hospital. Dr. Alzheimer, a thirty-seven-year-old neuropathologist and clinician from the small Bavarian village of Markbreit-am-Main, observed in his new patient a remarkable cluster of symptoms: severe disorientation, reduced comprehension, aphasia (language impairment), paranoia, hallucinations, and a short-term memory so incapacitated that when he spoke her full name, Frau Auguste D——, and asked her to write it down, the patient got only as far as “Frau” before needing the doctor to repeat the rest.

He spoke her name again. She wrote “Augu” and again stopped.

When Alzheimer prompted her a third time, she was able to write her entire first name and the initial “D” before finally giving up, telling the doctor, “I have lost myself.”

Her condition did not improve. It became apparent that there was nothing that anyone at this or any other hospital could do for Frau D. except to insure her safety and try to keep her as clean and comfortable as possible. Over the next four and a half years, she became increasingly disoriented, delusional, and incoherent. She was often hostile.

“Her gestures showed a complete helplessness,” Alzheimer later noted in a published report. “She was disoriented as to time and place. From time to time she would state that she did not understand anything, that she felt confused and totally lost. Sometimes she considered the coming of the doctor as an official visit and apologized for not having finished her work, but other times she would start to yell out of the fear that the doctor wanted to operate on her [or] damage her woman’s honor. From time to time she was completely delirious, dragging her blankets and sheets to and fro, calling for her husband and daughter, and seeming to have auditory hallucinations. Often she would scream for hours and hours in a horrible voice.”

By November 1904, three and a half years into her illness, Auguste D. was bedridden, incontinent, and largely immobile. Occasionally, she busied herself with her bed clothes. Notes from October 1905 indicate that she had become permanently curled up in a fetal position, with her knees drawn up to her chest, muttering but unable to speak, and requiring assistance to be fed.

What was this strange disease that would take an otherwise healthy middle-aged woman and slowly—very slowly, as measured against most disease models—peel away, layer by layer, her ability to remember, to communicate her thoughts and finally to understand the world around her? What most struck Alzheimer, an experienced diagnostician, was that this condition could not fit neatly into any of the standard psychiatric boxes. The symptoms of Auguste D. did not present themselves as a case of acute delirium or the consequence of a stroke; both would have come on more suddenly. Nor was this the general paresis—mood changes, hyperactive reflexes, hallucinations—that can set in during the late stages of syphilis. She was clearly not a victim of dementia praecox (what we now call schizophrenia), or Parkinson’s palsy, or Friedreich’s ataxia, or Huntington’s disease, or Korsakoff’s syndrome, or any of the other well-recognized neurological disorders of the day, disorders that Alzheimer routinely treated in his ward. One of the fundamental elements of diagnostic medicine has always been the exercise of exclusion, to systematically rule out whatever can be ruled out and then see what possibilities are left standing. But Alzheimer had nothing left.

What the fifty-one-year-old Auguste D.’s condition did strongly evoke was a well-known ailment among the elderly: a sharp unraveling of memory and mind that had, for more than five thousand years, been accepted by doctors and philosophers as a routine consequence of aging.

History is stacked with colorful, poignant accounts of the elderly behaving in strange ways before they die, losing connection with their memories and the world around them, making rash decisions, acting with the impetuousness and irresponsibility of children. Plato insisted that those suffering from “the influence of extreme old age” should be excused from the commission of the crimes of sacrilege, treachery, and treason. Cicero lamented the folly of “frivolous” old men. Homer, Aristotle, Maimonides, Chaucer, Thackeray, Boswell, Pope, and Swift all wrote of a distressing feebleness of mind that infected those of advancing years.

“Old age,” wrote Roger Bacon, “is the home of forgetfulness.”

Known as morosis in Greek, oblivio and dementia in Latin, dotage in Middle English, démence in French, and fatuity in eighteenth-century English, the condition was definitively termed senile dementia in 1838 by the French psychiatrist Jean Étienne Esquirol. In a depiction any doctor or caregiver would recognize today, Esquirol wrote: “Senile dementia is established slowly. It commences with enfeeblement of memory, particularly the memory of recent impressions.”

But that was senile dementia. What was this? Alois Alzheimer wanted to know. Why did a fifty-one-year-old appear to be going senile? How could Auguste D. be suffering from the influence of extreme old age?

We are the sum of our memories. Everything we know, everything we perceive, every movement we make is shaped by them. “The truth is,” Friedrich Nietzsche wrote, “that, in the process by which the human being, in thinking, reflecting, comparing, separating, and combining … inside that surrounding misty cloud a bright gleaming beam of light arises, only then, through the power of using the past for living and making history out of what has happened, does a person first become a person.”

The Austrian psychiatrist Viktor Frankl made much the same point in Man’s Search for Meaning, his memoir of experiences as a concentration camp inmate. Frankl recalled trying to lift the spirits of his fellow camp inmates on an especially awful day in Dachau: “I did not only talk of the future and the veil which was drawn over it. I also mentioned the past; all its joys, and how its light shone even in the present darkness. [I quoted] a poet … who had written: Was Du erlebst, kann keine Macht der Welt Dir rauben. (What you have experienced, no power on earth can take from you.) Not only our experiences, but all we have done, whatever great thoughts we may have had and all we have suffered, all this is not lost, though it is past; we have brought it into being. Having been is a kind of being, and perhaps the surest kind.”

Emerson was also fascinated by memory—how it worked, why it failed, the ways it shaped human consciousness. Memory, he offered about a decade or so before his own troubles first appeared, is “the cement, the bitumen, the matrix in which the other faculties are embedded … without it all life and thought were an unrelated succession.” While he constructed an elaborate external memory system in topical notebooks, filling thousands of pages of facts and observations that were intricately cross-referenced and indexed, Emerson was also known for his own keen internal memory. He could recite by heart all of Milton’s “Lycidas” and much of Wordsworth, and made it a regular practice to recite poetry to his children on their walks. His journal entries depict an enchantment with the memory feats of others.

He kept a list:

• Frederic the Great knew every bottle in his cellar.

• Magliabecchi wrote off his book from memory.

• Seneca could say 2,000 words in one hearing.

• L. Scipio knew the name of every man in Rome.

• Judge Parsons knew all his dockets next year.

• Themistocles knew the names of all the Athenians.

“We estimate a man by how much he remembers,” Emerson wrote.

Ronald Reagan was never particularly admired for his memory. But in the late 1980s and early ’90s, he slowly began to lose his grasp on ordinary function. In 1992, three years after leaving the White House, Reagan’s forgetting became impossible to ignore. He was eighty-one.

Both his mother and older brother had experienced senility, and he had demonstrated a mild forgetfulness in the late years of his presidency. Like many people who eventually suffer from the disease, Reagan may have had an inkling for some time of what was to come. In his stable of disarming jokes were several about memory troubles afflicting the elderly. He shared one at a 1985 dinner honoring Senator Russell Long.

An elderly couple was getting ready for bed one night, Reagan told the crowd. The wife turned to her husband and said, “I’m just so hungry for ice cream and there isn’t any in the house.”

“I’ll get you some,” her husband offered.

“You’re a dear,” she said. “Vanilla with chocolate sauce. Write it down—you’ll forget.”

“I won’t forget,” he said.

“With whipped cream on top.”

“Vanilla with chocolate sauce and whipped cream on top,” he repeated.

“And a cherry,” she said.

“And a cherry on top.”

“Please write it down,” she said. “I know you’ll forget.”

“I won’t forget,” he insisted. “Vanilla with chocolate sauce, whipped cream, and a cherry on top.”

The husband went off and returned after a while with a paper bag, which he handed to his wife in bed. She opened up the bag, and pulled out a ham sandwich.

“I told you to write it down,” she said. “You forgot the mustard.”

It seems clear enough that Reagan was increasingly bothered by personal memory lapses. In a regular White House checkup late in his second term, the President began by joking to his doctor, “I have three things that I want to tell you today. The first is that I seem to be having a little problem with my memory. I cannot remember the other two.”

Did Reagan have Alzheimer’s disease in office? Yes and no. Without a doubt, he was on his way to getting the disease, which develops over many years. But it is equally clear that there was not yet nearly enough decline in function to support even a tentative diagnosis. Reagan’s mind was well within the realm of normal functioning. Even if his doctors had been looking intently for Alzheimer’s, it is still likely that they would not have been able to detect the disease-in-progress. A slight deterioration of memory is so common among the elderly that even today it is considered to be a natural (if unwelcome) consequence of aging. About a third to a half of all human beings experience some mild decline in memory as they get older, taking longer to learn directions, for example, or having some difficulty recalling names or numbers.

Alzheimer’s disease overtakes a person very gradually, and for a while can be indistinguishable from such mild memory loss. But eventually the forgetting reaches the stage where it is quite distinct from an absentminded loss of one’s glasses or keys. Fleeting moments of almost total confusion seize a person who is otherwise entirely healthy and lucid. Suddenly, on a routine drive home from work, an intersection he has seen a thousand times is now totally unfamiliar. Or he is asking about when his son is coming back from his vacation, and his wife says: “What do you mean? We both spoke to him last night.” Or he is paying the check after a perfectly pleasant night out and it’s the strangest thing, but he just cannot calculate the 20 percent tip.

The first few slips get chalked up to anxiety or a lousy night’s sleep or a bad cold. But how to consider these incidents of disorientation and confusion when they begin to occur with some frequency? What begin as isolated incidents start to mount and soon become impossible to ignore. In fact, they are not incidents; collectively, they are signs of a degenerative condition. Your brain is under attack. Months and years go by. Now you are losing your balance. Now you can no longer make sense of an analog clock. Now you cannot find the words to complain about your food. Now your handsome young husband has disappeared and a strange elderly man has taken his place. Why is someone taking your clothes off and pouring warm water over you? How long have you been lying in this strange bed?

By 1992, the signs of Reagan’s illness were impossible to ignore. At the conclusion of a medical exam in September, as the New York Times would later report, Reagan looked up at his doctor of many years with an utterly blank face and said, “What am I supposed to do next?” This time, the doctor knew that something was very wrong.

Sixteen months later, in February 1994, Reagan flew back to Washington, D.C., from his retirement home in Bel Air, California, for what would turn out to be his final visit. The occasion was a dinner celebrating his own eighty-third birthday, attended by Margaret Thatcher and twenty-five hundred other friends and supporters.

Before the gala began, the former President had trouble recognizing a former Secret Service agent whom he had known well in the White House. This didn’t come as a total shock to his wife, Nancy, and other close friends, but it did cause them to worry that Reagan might have problems with his speech that night.

The show went on as planned. After an introduction by Thatcher, Reagan strolled to the podium. He began to speak, then stumbled, and paused. His doctor, John Hutton, feared that Reagan was about to humiliate himself. “I was holding my breath, wondering how he would get started,” Hutton later recalled, “when suddenly something switched on, his voice resounded, he paused at the right places, and he was his old self.”

Back at his hotel after the dinner, Reagan again slipped into his unsettling new self, turning to Nancy and saying, “Well, I’ve got to wait a minute. I’m not quite sure where I am.” Though the diagnosis and public announcement were both months away, Reagan was already well along the sad path already trod by his mother, his brother, and by Auguste D.

The doctors who diagnosed Reagan in 1994 knew with some specificity what was happening to his brain. Portions of his cerebral cortex, the thin layer of gray matter coating the outside of his brain, were becoming steadily clouded with two separate forms of cellular debris: clumpy brown spherical plaques floating between the neurons, and long black stringy tangles choking neurons from inside their cell membranes. As those plaques and tangles spread, some neurons were losing the ability to transmit messages to one another. Levels of glucose, the brain’s sole energy source, were falling precipitously, weakening cell function; neurotransmitters, the chemicals that facilitate messages between the neurons, were becoming obstructed. The tangles in some areas of the brain were getting to be so thick it was like trying to kick a football through a chain-link fence.

Ultimately, many of the neurons would die, and the brain would begin to shrink. Because the brain is highly specialized, the strangulation of each clump of neurons would restrict a very specific function—the ability to convert recent events into reliable memories, for example, or the ability to recall specific words, or to consider basic math problems. Or, eventually, to speak at all, or recognize a loved one. Or to walk or swallow or breathe.

We know about plaques and tangles because of Auguste D. and Alois Alzheimer. After four and a half years in the hospital, Frau D. died on April 8, 1906. Her file listed the cause as “septicaemia due to decubitis”—acute blood poisoning resulting from infectious bed sores. In her last days, she had pneumonia, inflammation of the kidneys, excessive fluid in the brain, and a high fever. On the day of her death, doctors understood no more than they had on the first day she was admitted. They could say only this about Auguste D.: that a psychic disturbance had developed in the absence of epileptic fits, that the disturbance had progressed, and that death had finally intervened.

Alois Alzheimer wanted to learn more. He wanted to look at her brain.

Standing apart from most doctors at the time, Alzheimer was equally interested in both clinical and laboratory work. He was known for his tireless schedule, his devoted teaching, and his own brand of forgetfulness. An inveterate smoker, he would put a half-smoked cigar down on the table before leaning into a student’s microscope for a consultation. A few minutes later, while shuffling to the next microscope, he’d light a fresh cigar, having forgotten about the smoke already in progress. At the end of each day, twenty microscopes later, students recalled, twenty cigar stumps would be left smoldering throughout the room.

But Alzheimer did not forget about the woman who had lost herself in Frankfurt. Though he had since moved to the Royal Psychiatric Clinic, in Munich, to work for the renowned psychiatrist Emil Kraepelin, he sent for Frau D.’s central nervous system as soon as she died. Her brain, brainstem, and spinal cord were gently removed from the elaborate bone casing, that flexible yet durable wrapper that allows us all to crouch, twist, and bump into things without much concern. The exposed contents were then likely wrapped in formalin-soaked towels, packed carefully in a wooden crate, and shipped by locomotive 190 miles southeast to Munich.

Imagine, now, that lifeless brain on a passenger train. A coconut-sized clump of grooved gelatinous flesh; an intricate network of prewired and self-adapting mechanisms perfected over more than a billion years of natural selection; powered by dual chemical and electrical systems, a machine as vulnerable as it is complex, designed to sacrifice durability for maximal function, to burn brightly—a human brain is 2 percent of the body’s weight but requires 20 percent of its energy consumption—at the cost of impermanence. Enormously powerful and potato-chip fragile at the same time, the brain is able to collect and retain a universe of knowledge and understanding, even wisdom, but cannot hold on to so much as a phone number once the glucose stops flowing. The train, an elementary device by comparison, can, with proper maintenance, be sustained forever. The brain, which conceived of the train and all of its mechanical cousins, cannot. It is ephemeral by design.

But there was nothing in the brain’s blueprint about this sort of thing, as far as Alzheimer could infer. This was a flaw in the design, a molecular glitch, a disease process, he suspected, and it was important to see what that process looked like up close.

It was also now actually possible to do this for the first time, thanks to a whirl of European innovation. Ernst Leitz and Carl Zeiss had just invented the first distortion-free microscopes, setting a standard in optics that survives today. Franz Nissl had revolutionized tissue-staining, making various cell constituents stand out, opening up what was characterized as “a new era” in the study of brain cells and tissues. (The “Nissl method” is still in use. Nissl, a close collaborator and friend of Alois Alzheimer, became a medical school legend with his instructions on how to time the staining process. “Take the brain out,” he advised. “Put it on the desk. Spit on the floor. When the spit is dry, put the brain in alcohol.”)

Dr. Alzheimer’s assistants prepared for microscopic examination more than 250 slides from slivers of the outer lining (the meninges) of Frau D.’s brain; from the large cerebral vessels; from the frontal, parietal, and occipital areas of the cerebral cortex (locus of conscious thought); from the cerebellum (regulator of balance, coordination, gait) and the brainstem (breathing and other basic life functions); and from the spinal cord, all chemically preserved in a cocktail of 90 percent alcohol/10 percent formalin, and stained according to a half-dozen recipes of Alzheimer’s contemporaries.

Having fixed, frozen, sliced, stained, and pressed the tissue between two thin pieces of glass, Alzheimer put down his cigar and removed his pince-nez, leaned into his state-of-the-art Zeiss microscope, and peered downward. Then, at a magnification of several hundred times, he finally saw her disease.

It looked like measles, or chicken pox, of the brain. The cortex was speckled with crusty brown clumps—plaques—too many to count. They varied in size, shape, and texture and seemed to be a hodgepodge of granules and short, crooked threads, as if they were sticky magnets for microscopic trash.

The plaques were nestled in amongst the neurons, in a space normally occupied by supporting tissue known as glial cells. They were so prominent that Alzheimer could see them without any stain at all, but they showed up best in a blend of magenta red, indigo carmine, and picric acid. Alzheimer had squinted at thousands of brain slides, but he found these clumps “peculiar” and had no idea what they could be.

A different stain, invented just four years earlier, revealed the other strange invasion of Auguste D.’s brain. In the second and third layers of the cortex, nearly a third of the neurons had been obliterated internally, overrun with what Alzheimer called “a tangled bundle of fibrils”—weedy, menacing strands of rope bundled densely together.

The tangles were just as foreign to Alzheimer as the plaques, but at least the ingredients looked familiar. They seemed to be composed of fibrils, an ordinary component of every neuron. It was as if these mild-mannered, or “Jekyll,” fibrils had swallowed some sort of steroidal toxin and been transformed into “Hyde” fibrils, growing well out of proportion and destroying everything within their reach. Many affected neurons were missing a nucleus completely, and most of the rest of their cell contents. A good portion of the neurons in the upper cell layers of the cortex had disappeared. They just weren’t there. Alzheimer’s assistant Gaetano Perusini wrote of the neurofibrillary tangles in Frau D.’s brain:

It is impossible to give a description of all the possible pictures: there are present all the variable and twisted formations that one can imagine; at times large fibrils seem to lie only on the periphery of the cell. But on focusing untangled fibrillar agglomerations are found. Changing the focus again one has the impression that the single dark-coloured fibrils unwind into an infinite number of thinner fibrils … arranged as balls of twine or half-moons or baskets.

Connecting a camera lucida to the top of the microscope, Alzheimer and Perusini both drew pictures of the tangles.

The menacing drawings perfectly convey the ghastly significance of their discovery. Here was the evidence that Auguste D. had not lost herself. Rather, her “self” was taken from her. Cell by cell by cell, she had been strangled by unwelcome, malignant intruders.

What were they, exactly, and where did they come from?



When my kids began to say they were worrying about my memory, I said to them, “Well, I’ve never had a photographic memory, and I have a lot more on my mind now. There’s a lot more to remember with life being so complex. How can I remember everything? What do you want—total recall?” I always had an answer. I really was in denial, and it just didn’t occur to me that I had a problem. But I also knew that they weren’t totally exaggerating.

—D.

New York, New York




Chapter 2 BOTHERED


Queens, New York: August 1998

It was lunch time in Freund House, in the village of Flushing. A small group of elderly Jews sat quietly at a round table. Not much was said as they ruffled open their brown paper bags and popped the lids off drinks. Someone brought in a big bottle of ginger ale and some plastic cups, and offered to pour.

Irving looked over at Greta and noticed that she was sitting still, her hands folded together on the bright red table cloth.

“Did you bring your lunch today, Greta?”

“I don’t think so. I usually don’t bring my lunch here.”

“Yes, you do. You bring cereal.”

Irving waited for Greta to recollect her routine, but she could not. An elegant, shrunken woman with short cropped hair, dark eyebrows, and a supple, leathery face, Greta did not look even remotely like someone in decline. Her eyes still sparkled and her voice had spunk. She spoke without hesitation and in full, clear sentences. There was no clue from her cadences that her brain was under attack.

Paying close attention, though, one could tell that something was not right. For example, in a conversation about Japan, Greta very clearly explained that she had been there a number of times. She discussed the temples of Kyoto, which she enjoyed, and the food, which she did not.

Then, about an hour later, the subject of Japan came up again. This time, she said matter-of-factly, “Japan—never did get there. Couldn’t get in.”

These hiccups in logic were typical, I now recognized, of someone beginning to advance past the very earliest stages of the disease. She wasn’t very far along yet, and most of her brain was still working quite well; but her symptoms were no longer strictly limited to the classic short-term memory loss that usually signals the disease’s onset. Occasionally, now, a queer incongruity would creep in.

Standing off to one side of the table was Judy Joseph, the co-leader, with Irving Brickman, of this support group. About a year earlier she had been introduced to Irving in the New York offices of the Alzheimer’s Association, where each had come to see what, if anything, could be done about this ominous new social phenomenon. Suddenly, it seemed, Alzheimer’s disease was everywhere. Nursing home dementia units were filling beyond capacity. Middle-aged children were moving back home to take care of their parents. Community police were regularly being phoned to help track down wandering relatives. The disease was cropping up continually in newspaper articles and everyday conversation. Perhaps most tellingly, a vibrant Alzheimer’s consumer market was springing up—products like automatic medication dispensers (no memory required!), wireless tracking devices for wanderers, and even a stovetop fire extinguisher designed explicitly for people who might forget to turn off the range.

All of a sudden, everyone seemed to know someone touched by Alzheimer’s. Partly, this was due to a shift in public conception of senile dementia. Only in the mid-1970s had doctors started to realize that senility is not an inevitable process of brain aging and decay but a recognizable—and perhaps one day treatable—disorder. Gradually, this perception also started to seep into the general consciousness: Senility is a disease.

Since then, there had been a staggering rise in actual cases of Alzheimer’s, corresponding to a vast increase in the elderly population. People were now living much longer lives. Longer lives meant more cases of Alzheimer’s. Since 1975, the estimated number of Alzheimer’s cases in the U.S. had grown tenfold, from 500,000 to nearly 5 million. Worldwide, the total was probably about three times that figure. In the absence of a medical breakthrough, the gloomy trend would not only continue, but would also get much, much worse.

The Roman poet Virgil wrote in the first century B.C., “Time wastes all things, the mind, too.” He was partly right. Scientists do not believe that Alzheimer’s is an inevitable consequence of aging. Many people will never get the disease regardless of how long they live. But aging is by far the greatest risk factor. It is almost unheard of in people aged 20–39, and very uncommon (about one in 2,500) for people aged 40–59. For people in their sixties, the odds begin to get more worrisome. An estimated

• 1 percent of 65-year-olds

• 2 percent of 68-year-olds

• 3 percent of 70-year-olds

• 6 percent of 73-year-olds

• 9 percent of 75-year-olds

• 13 percent of 77-year-olds

and so on have Alzheimer’s or a closely related dementia. The risk accelerates with age, to the point where dementia affects nearly half of those eighty-five and over.

So, as the twentieth century came to a close, a shadow legacy was rapidly becoming apparent—the dark, unintended consequence of the century’s great advances in hygiene, nutrition, and medicine. Life spans in industrialized nations had nearly doubled over the previous one hundred years, and the percentage of elderly among the general population had more than tripled. In the process, the number of cases of senile dementia mushroomed. A hundred years before, it had not even been a statistical blip. Paradoxically, in the full blush of medical progress of the twentieth century, it had blossomed into a major public health problem.

Most strikingly to social workers like Judy and Irving, the number of people who had Alzheimer’s and who knew they had Alzheimer’s had exploded. A huge portion of the newly diagnosed cases were in the very early stages of the disease. “This is something new in the field,” Irving explained. “Most people never before realized that there is an early stage of Alzheimer’s. I had worked with the more advanced stages, but when I came into this it was overwhelming for me. It’s very hard to get used to a normal person who happens to have dementia. It’s a whole different ballgame.”

Judy and Irving recognized, along with many others in the national Alzheimer’s community, that something had to be done to help this emerging new constituency: early-stage dementia sufferers still functioning well enough to fully understand what lay ahead. With the assistance of the Alzheimer’s Association, they formed a support group at Freund House. “Our goal,” explained Irving, “is to try to help these people live a quality life, to help them gain some coping mechanisms for their deficits, and to help them feel better as human beings.” While scientists did battle with this disease, victims and their families had the opposite task: to make a certain peace with it, to struggle to understand the loss, come to terms with it, create meaning out of it.

Alzheimer’s is what doctors call a disease of “insidious onset,” by which they mean that it has no definitive starting point. The plaques and tangles proliferate so slowly—over decades, perhaps—and silently that their damage can be nearly impossible to detect until they have made considerable progress. Part of the function of any early-stage support group must be to try to make sense of this strange new terrain that lies between healthy and demented. Where, in specific behavioral terms, is the person overshadowed by the disease?

Individually and collectively, the Freund House group was trying to find out, and to make sense of the answer. “My wife gets frustrated with me,” Arnie related to his fellow group members, “and she is right to be frustrated. She asks me to put a can in the recycling … and I don’t do it. She says, ‘I know this is because of your illness, that this is not you.’”

Sadie nodded her head in recognition. “My mother had this, too,” she said. “Now I know what it was like for my father to take care of her. We used to get so mad at him when he would be short with her.”

Coping with a particular disability was one thing; trying to cope with an ever-shifting invisible illness, though, was a challenge unique to Alzheimer’s disease. In this early period, the insidiousness itself was often the most troubling thing about the disease—arguably even a disease unto itself. As a group, these new patients could gain a more confident understanding of their disease, and tackle issues that would seem impossibly difficult to one isolated, failing person.

Driving, for instance. The first big question they confronted right after forming the group was: Should they continue, in this blurry period of semi-normalcy, to pilot massive steel boxes at thirty and forty and fifty miles per hour down roads lined with bicycles and toddlers? Studies showed conclusively that Alzheimer’s is, overall, a major driving hazard. Bystanders had been killed by Alzheimer’s patients making a lapse in judgment or being overcome momentarily by confusion. But the law had not yet caught up with this reality. Even with a diagnosis, no doctor or judge had ever confiscated a license. Families were forced to decide on their own when driving was no longer appropriate.

Together, after much deliberation, the group decided that it had already become too dangerous. Collectively, they gave up this highly charged symbol of autonomy and competence. On this shaky new terrain, a person’s independence could no longer be taken for granted.

In the summer of 1984, at the age of eighty-five, E. B. White, the tender essayist and author of Charlotte’s Web, became waylaid by some form of dementia. It came on very swiftly. In August, he began to complain of some mild disorientation. “We didn’t pay much attention,” recalls his stepson, Roger Angell, “because he was a world-class hypochondriac.” But just a few weeks later, White was severely confused much of the time. By the following May, he was bedridden with full-on dementia, running in and out of vivid hallucinations and telling visitors, “So many dreams—it’s hard to pick out the right one.” He died just a few months after that, in October 1985.

An obituary in the New York Times reported White as having Alzheimer’s disease, but that appeared to miss the mark. In fact, he was never even informally diagnosed with the disease, and his symptoms strongly suggested another illness. The rapid onset of the confusion and the abrupt shift from one stage to the next were classic signs of multi-infarct dementia, the second-most common cause (15 percent) of senile dementia after Alzheimer’s (60 percent). Multi-infarct dementia is caused by a series of tiny strokes. Its victims can have much in common with those of Alzheimer’s, but the experience is not as much of an enigma. Its cause is known; the condition is somewhat treatable and, to a certain extent, preventable (diet, exercise, and medication can have an enormous impact on the risk of strokes). Its jerky, stepwise approach is easier to follow and understand as symptoms worsen.

Alzheimer’s disease is not abrupt. It sets in so gradually that its beginning is imperceptible. Creeping diseases blur the boundaries in such a way that they can undermine our basic assumptions of illness. Alzheimer’s drifts from one stage to the next in a slow-motion haze. The disease is so gradual in its progression that it has come to be formally defined by that insidiousness. This is one of the disease’s primary clinical features, one key way that Alzheimer’s can be distinguished from other types of dementia: those caused by strokes, brain tumor, underactive thyroid, and vitamin deficiency or imbalance in electrolytes, glucose, or calcium (all treatable and potentially reversible conditions).

It is also nearly impossible to officially diagnose. A definitive determination requires evidence of both plaques and tangles—which cannot be obtained without drilling into the patient’s skull, snipping a tiny piece of brain tissue, and examining it under a microscope. Brain biopsies are today considered far too invasive for a patient who does not face imminent danger. Thus—Kafka would have enjoyed this—as a general rule, Alzheimer’s sufferers must die before they can be definitively diagnosed. Until autopsy, the formal diagnosis can only be “probable Alzheimer’s.”

These days, a decent neuropsychologist can maneuver within this paradox—can make a diagnosis of probable Alzheimer’s with a confidence of about 90 percent—through a battery of tests. The process almost always begins with this simple quiz:

What is today’s date?

What day of the week is it?

What is the season?

What country are we in?

What city?

What neighborhood?

What building are we in?

What floor are we on?

I’m going to name three objects and I want you to repeat them back to me: street, banana, hammer.

I’d like you to count backwards from one hundred by seven. [Stop after five answers.]

Can you repeat back to me the three objects I mentioned a moment ago?

[Points at any object in the room.] What do we call this?

[Points at another object.] What do we call this?

Repeat after me: “No ifs, ands, or buts.”

Take this piece of paper in your right hand, fold it in half, and put it on the floor.

[Without speaking, doctor shows the patient a piece of paper with “CLOSE YOUR EYES” printed on it.]

Please write a sentence for me. It can say anything at all, but make it a complete sentence.

Here is a diagram of two intersecting pentagons. Please copy this drawing onto a plain piece of paper.

This neurological obstacle course is called the Mini Mental State Examination (MMSE). Introduced in 1975, it has been a part of the standard diagnostic repertoire ever since. The MMSE is crude but generally very effective in detecting problems with time and place orientation, object registration, abstract thinking, recall, verbal and written cognition, and constructional praxis. A person with normal functioning will score very close to the perfect thirty points (I scored twenty-nine, getting the date wrong). A person with early-to-moderate dementia will generally fall below twenty-four.

The very earliest symptoms in Alzheimer’s are short-term memory loss—the profound forgetting of incidents or conversations from just a few hours or the day before; fleeting spatial disorientation; trouble with words and arithmetic; and some impairment of judgment. Later on, in the middle stages of the disease, more severe memory problems are just a part of a full suite of cognitive losses. Following that, the late stages feature further cognitive loss and a series of progressive physical disabilities, ending in death.

One brilliantly simple exam, the Clock Test, can help foretell all of this and can enable a doctor to pinpoint incipient dementia in nine out of ten cases. In the Clock Test, the doctor instructs the patient to draw a clock on a piece of paper and then draw hands to a certain time. Neurologists have discovered that patients in the early stages of dementia tend to make many more errors of omission and misplacing of numbers on the clock than cognitively healthy people. They’re not entirely sure why this is, but the accuracy of the test speaks for itself.

A battery of other performance tests can help highlight and clarify neurological deficiencies. The Buschke Selective Reminding Test measures the subject’s short-term verbal memory. The Wisconsin Card Sorting Test gauges the ability to deduce sorting patterns. In the Trail Making Test, psychomotor skills are measured by timing a subject’s attempt to draw a line connecting consecutively numbered circles. Porteus Mazes measure planning and abstract-puzzle-solving ability.

If the patient performs poorly in a consistent fashion, the next step will likely involve elaborate instruments. Conveniently for physicians, Alzheimer’s disease always begins in the same place: a curved, two-inch-long, peapod-like structure in the brain’s temporal lobes called the hippocampus (the temporal lobes are located on either side of the head, inward from the ear). Doctors can get a good look at the hippocampus with a magnetic resonance imaging (MRI) scanner, which bombards the body with radio waves and measures the reflections off tissue. A simple volume measurement of the hippocampus will often show, even in the very early stages of Alzheimer’s, a pronounced decrease in volume, particularly in contrast with other brain structures. By itself, the MRI cannot diagnose Alzheimer’s. But it can add one more helpful piece to the diagnostic puzzle.

Other advanced measurements might also help: A positron emission tomography (PET) scan may detect a decrease in oxygen flow or glucose metabolism in the same area. A single photon emission computed tomography (SPECT) scan may catch decreases in blood flow. A moderate to severe amount of slowing in the alpha rhythm in an electroencephalogram (EEG) is often characteristic of dementia. But such measurements are generally not required for a tentative diagnosis. In the face of convincing results from memory and performance tests, and in the absence of any contravening evidence—disturbance in consciousness, extremely rapid onset of symptoms, preponderance of tremors or other muscular symptoms, difficulties with eye movements or reports of temporary blindness, seizures, depression, psychosis, head trauma, a history of alcoholism or drug abuse, any indication of diabetes, syphilis, or AIDS—a diagnosis of probable Alzheimer’s is rendered.

Alzheimer’s disease. The diagnosis is a side-impact collision of overwhelming force. It seems unreal and unjust. After coming up for air, the sufferer might ask, silently or out loud, “What have I done to deserve this?” The answer is, simply, nothing. “I remember walking out of the clinic and into a fresh San Diego night feeling like a very helpless and broken man,” recalled Bill, a fifty-four-year-old magazine editor, to writer Lisa Snyder. “I wondered if there was anything for me to live for.”

It can take a while to sink in. Experienced doctors know not to try to convey any other important information to a patient or family member on the same day that they disclose the diagnosis. They put some helpful information into a letter, and schedule a follow-up.

There is no cure for Alzheimer’s at the present time, and not much in the way of treatment. Historically, the one saving grace of the disease over the years has been that many, if not most, of the people who acquire the disease do not comprehend what is about to happen to them and their families. Now, for better or worse, that has changed. More and more are learning at the earliest possible opportunity what they have, and what it means.

What will they do with the advance knowledge? It is not an easy question. Will they use the time left to get their affairs in order and to prepare themselves emotionally for the long fade? Or will the knowledge only add to the frustration and force them into a psychological spiral to accompany the physiological one?

The Freund House early-stage support group was one experimental approach to tackling such unknowns. When Judy and Irving created it in 1997, they weren’t sure it would work. Could people struggling with memory loss, spatial disorientation, and confusion actually strike up a meaningful relationship with a group of strangers? They had to assemble just the right team. “We had to turn many people away,” said Judy, “because we didn’t feel they were right for a support group. They weren’t introspective enough. They weren’t bothered enough.”

The group was also temporary by design. As participants lost the ability to contribute, they would be eased out of the group, and perhaps admitted to a middle-stage group like the one that Judy ran down the hall. In that group, volunteer caregivers always accompanied patients to the restroom and back, because otherwise they would get lost. Most, not all, still responded to their own name. After a cafeteria-style lunch, everyone came together in a circle to sing fun songs together, like the theme from Barney:

I love you

You love me

We’re a happy family

Members of the early-stage group occasionally caught a glimpse of the middle-stage group as they passed by to get a cup of coffee. The quiet, desperate hope of everyone in this group was not to end up in the other group. Barring a scientific miracle, though, there would be no avoiding it. The average interval from diagnosis to death in Alzheimer’s disease is eight years.

In the meantime, there were a hundred small consolations. The early-stage group members had quickly come to rely on one another for help through this very strange ordeal. Sometimes barely able to remember from week to week, they had nevertheless become friends. They shared memories of movie stars and kosher butchers. They talked about travel and passed around pictures of grandchildren. They even talked politics.

“Greta, any comments on Giuliani?” Judy asked one afternoon.

Greta swatted an invisible bug away from her face. “Oh don’t get me started about him,” she said. “You know I can’t stand him.”

“Clinton, then? What does everyone think about Monica?”

Opinions ran the gamut. Ted, his hands shaking with a Parkinsonian tremor (it is not unusual for people to suffer from both Parkinson’s and Alzheimer’s), suggested that Clinton should resign because he lied directly to the American people. Greta, a lifelong subscriber to The Nation, thought that Clinton probably kissed Monica but that the whole issue was overblown. Sadie thought it was all a Republican scheme.

Doris had an opinion, too, but with her severe expressive aphasia—an inability to retrieve words—she had great difficulty making it known.

“Gore … President … I think … good leader … lies …”

She appeared to be aware of her thoughts and very clear on what she wanted to say. But the words were no longer accessible. This was especially painful to watch because, as everyone in the group knew by now, Doris had a forty-year-old son with cerebral palsy who was deaf. The two were very close, and, as it happened, she was the only one in the family to have ever learned sign language. Now Doris’s aphasia was also wiping away that second and more vital language. She could no longer speak to her son, leaving him marooned.

It was now a few minutes after one o’clock, time to say good-bye for the week. Rides were arranged. Someone went to fetch William’s wife, a volunteer in the middle-stage group.

Robert seemed to be having a hard time of it. Just a moment before, he had been lucidly telling me about his family and his past. He’d had no problem relating how he was spirited out of Nazi Germany as a young boy, turned over to relatives in England and later in New York. I learned all about his children, their occupations and families, the cities they lived in. But now he was struggling to understand a piece of paper his wife had written out for him about getting home. To the undamaged brain, the instructions were fairly straightforward—Robert will be picked up by the car service at 1:15, and should be driven to his home at ___ Street.…—but he was having a lot of trouble making sense of it. Then there was the other problem. In the last half hour, he had told me how he eventually came to live in the Bronx, where he was introduced to his wife, a distant cousin. He had described how crowded that Bronx apartment was, and where else he had lived in the city as he’d grown older. But now, for the life of him, Robert could not remember where he had put his jacket.

It was on the back of his chair.



Very often I wander around looking for something which I know is very pertinent, but then after a while I forget about what it is I was looking for.… Once the idea is lost, everything is lost and I have nothing to do but wander around trying to figure out what it was that was so important earlier. You have to learn to be satisfied with what comes to you.

—C.S.H.

Harrisonburg, Virginia




Chapter 3 THE GOD WHO FORGOT AND THE MAN WHO COULD NOT


There could be no happiness, cheerfulness, hope, pride, immediacy, without forgetfulness. The person in whom this apparatus of suppression is damaged, so that it stops working, can be compared … to a dyspeptic; he cannot “cope” with anything.

—FRIEDRICH NIETZSCHE

As found in the Pyramid Texts, from 2800 B.C., Ra was the Sun God, the creator of the universe and of all other gods. From his own saliva came air and moisture. From his tears came humankind and the river Nile. He was all-powerful and, of course, immortal—but still not immune to the ravages of time: Ra, the supreme God, became old and senile. He began to lose his wits, and became easy prey for usurpers.

Throughout recorded history, human beings have been celebrating the powers of memory and lamenting its frailties. “Worse than any loss in body,” wrote the Roman poet Juvenal in the first century A.D., “is the failing mind which forgets the names of slaves, and cannot recognize the face of the old friend who dined with him last night, nor those of the children whom he has begotten and brought up.”

It took several thousand years, though, for anyone to figure out how memory actually worked. Plato was among the first to suggest a mechanism. His notion was of a literal impression made upon the mind. “Let us suppose,” he wrote, “that every man has in his mind a block of wax of various qualities, the gift of Memory, the mother of the Muses; and on this he receives the seal or stamp of those sensations and perceptions which he wishes to remember. That which he succeeds in stamping is remembered and known by him as long as the impression lasts; but that, of which the impression is rubbed out or imperfectly made, is forgotten, and not known.”

Later came the ventricular theory of cognition, from Galen (129 – ca. 199 A.D.), Nemesius (fourth century), and St. Augustine (354–430). According to this notion, the three major functions of the brain—sensation, movement, and memory—were governed from three large, round fluid-filled sacs. Vital Spirit, a mysterious substance that also contained the human soul, was harbor to the swirl of memories.

From this model came cerebral localization, the theory that the various functions of the brain were each controlled by specialized “modules.” This model of specialization turned out to be generally correct (if radically different in the details from what Galen had imagined). In the early twentieth century, it emerged that the brain wasn’t really an organ so much as a collection of organs, dozens of structures interacting with one another in dazzling complexity. Deep in the center of the brain the amygdala regulates fear while the pituitary coordinates adrenaline and other hormones. Visual stimulus is processed in the occipital lobe, toward the rear of the skull. Perception of texture is mediated by Area One of the parietal lobe near the top of the head, while, just to the rear, the adjacent Area Two differentiates between the size and shape of objects and the position of joints. The prefrontal cortex, snuggled just behind the forehead, spurs self-determination. Broca’s area, near the eyes, enables speech. Wernicke’s area, above the ears, facilitates the understanding of speech.

The more researchers discovered about localization, though, the more they wondered about the specialized zone for memory. Where was it? If vision was in the back of the brain, texture on top, and so on, what region or regions controlled the formation of lasting impressions and the retrieval of those impressions?

Part of the answer came in 1953, when a Harvard-trained neurosurgeon named William Beecher Scoville performed experimental surgery on a twenty-seven-year-old patient known as H.M. He had been suffering from violent epileptic seizures since childhood, and in a last-ditch effort to give him a chance at a normal life, Scoville removed a small collection of structures, including the hippocampus, from the interior portion of his brain’s two temporal lobes. The surgery was a great success in that it significantly reduced the severity of H.M.’s epilepsy. But it was also a catastrophe in that it eliminated his ability to lay down new memories. The case revolutionized the study of memory, revealing that the hippocampus is essential in consolidating immediate thoughts and impressions into longer-lasting memories (which are in turn stored elsewhere).

Time stopped for H.M. in 1953. For the rest of his long life, he was never again able to learn a new name or face, or to remember a single new fact or thought. Many doctors, researchers, and caregivers got to know him quite well in the years that followed, but they were still forced to introduce themselves to him every time they entered his room. As far as H.M. was concerned, he was always a twenty-five-year-old man who was consulting a doctor about his epilepsy (he had also lost all memory of the two years immediately prior to the surgery). H.M. became perhaps the most important neurological subject in history and was subject to a vast number of studies, but he remembered none of the experiments once they were out of his immediate concentration. He was always in the Now.

In the clinical lexicon, this was a perfect case of anterograde amnesia, the inability to store any new memories. Persons with incipient Alzheimer’s disease exhibit a slightly less severe form of the same problem. The memory of leaving the car keys in the bathroom isn’t so much lost as it was never actually formed.

In a healthy brain, sensory input is converted into memory in three basic stages. Before the input even reaches consciousness, it is held for a fraction of a second in an immediate storage system called a sensory buffer.

Moments later, as the perception is given conscious attention, it passes into another very temporary system called short-term (working) memory. Information can survive there for seconds or minutes before dissolving away.

Some of the information stirring in working memory is captured by a mechanism that very slowly converts it into a long-term memory lasting years and even a lifetime.

Long-term memories can be either episodic or semantic. Episodic memories are very personal memories of firsthand events remembered in order of occurrence. Before the baseball game the other day, I put on my new pair of sneakers, which I had gotten earlier that morning. Then we drove to the stadium. Then we parked. Then we gave the man our tickets. Then we bought some hot dogs. Then we went to our seats …

Now, days later, if I notice a mustard stain on my shoe, I can plumb my episodic memory to determine when and how it happened. If my feet start bothering me, my episodic memory will help me figure out whether it happened before or after I bought my new shoes.

Semantic memories are what we know, as opposed to what we remember doing. They are our facts about the world, stored in relation to each other and not when we learned them. The memory of Neville Chamberlain’s “peace in our time” is semantic.

They are separate systems—interrelated, but separate. An early-stage Alzheimer’s patient who cannot retain memories of where she put her keys has not forgotten what keys are for, or what kind of car she drives. That will come much, much later, when she starts to lose old semantic memories.

The experience with H.M. taught researchers that the hippocampus is key to long-term memory formation. Without that tiny organ, he was totally incapable of forming new, lasting memories. Alzheimer’s patients suffer the exact same systemic loss, but over several years rather than one surgical afternoon. For H.M., there were no new memories after 1953, period. In later years, he was unable to recognize his own face in the mirror. Real time had marched on, 1955 … 1962 … 1974, but as far as he was concerned, he was still twenty-five years old. If you are a young man, alert and intelligent, and you look into an ordinary mirror only to discover the face of a sixty-year-old perfectly mimicking your expressions, perhaps only then do you know the real meaning of the word horror. Fortunately, the extreme distress H.M. suffered during such world-shattering incidents was always immediately and completely forgotten as soon as his attention could be distracted by something happening in the new moment. Not remembering can sometimes be a great blessing.

The discovery of hippocampus-as-memory-consolidator was critical. What memory specialists have been trying to figure out ever since then is, once formed, where do these long-term memories actually reside? Are memories stored up in the front of the brain in the prefrontal cortex? On top, in the parietal lobe? In the brainstem at the base of the brain? Where?

One tantalizing theory emerged in the late 1950s: memories were everywhere, stored in discrete molecules scattered throughout the brain. A stampede to confirm this notion was set off by a 1962 Journal of Neuropsychiatry article, “Memory Transfer Through Cannibalism in Planaria,” in which the University of Michigan’s James McConnell eagerly reported that worms could capture specific memories of other worms simply by eating those worms. McConnell had trained a group of flatworms to respond to light in a noninstinctive way. He then killed these worms, chopped them up, and fed them to untrained flatworms. After eating their brethren, McConnell claimed, the untrained worms proceeded to behave as though they had been trained—they had somehow acquired the memory of the trained worms. It was the unexpected apotheosis of the old saying, “You are what you eat.”

Out of this report numerous research grants were born, some of which yielded tantalizing results. Three years after McConnell’s initial study, four California scientists reported in the journal Science that when cells extracted from the brains of trained rats were injected into the guts of untrained rats, the untrained rats picked up the learned behavior of the trained rats. These experiments apparently showed that specific, concrete individual memories were embedded as information in discrete molecules in the same way that genetic information is embedded in DNA, and that these memories were transferable from brain to brain. A later experiment by Baylor University’s Georges Ungar was the most vivid yet: Brain cells from rats that had been trained to fear the dark were transferred to untrained mice (ordinarily, neither mice nor rats fear the dark), who suddenly took on this new fear. Ungar even isolated a peptide comprising fifteen amino acids that he said contained the newly created memory. He called the transmissible fear-of-the-dark memory molecule scotophobin.

The theory that emerged out of these experiments was of memory as a distinct informational molecule that could be created organically in one brain, isolated, and then transferred to another brain—even to the brain of another species. Its implications were immense. Had this cold fusion of an idea been validated rather than widely discredited not long after Ungar’s paper was published in Nature in 1972, it is clear that ours would be a very different world today: Memory swaps. Consciousness transfers. Neurochemical behavioral enhancements that would make Prozac seem like baby aspirin. The rapid decoding of a hidden science of memory molecules might well have spawned a new type of biochemical computer that could register, react to, and even create memory molecules of its own. Laptops (or cars or stuffed animals) could be afraid of the dark or partial to jazz or concerned about child abuse. Memories and feelings could be bottled and sold as easily as perfume.

But that world did not, and cannot, emerge. The memory transfer experiments, while entertaining and even seductive—DNA pioneer Francis Crick was among the many prestigious scientists on board for a while—were ultimately dismissed as seriously misguided. The idea of transferable memories strained credulity to begin with; to suggest that one animal’s specific fear could travel through another animal’s digestive tract, enter its bloodstream, find its way to the brain, and turn itself on again in the new host mind was an even further stretch.

And then there was the problem of physical mass. Skeptics calculated that if specific memories were contained in molecules the way Ungar suggested, the total number of memories accumulated over a lifetime would weigh somewhere in the vicinity of 220 pounds. The brain would literally be weighed down by thoughts and ideas.

After a decade or so, the notion and burgeoning industry of memory molecules crumbled into dust. It is now one particularly humiliating memory that many neuroscientists would just as soon not retain. What has grown up out of that rubble over the last thirty years is a very different understanding of memory—not as a substance but as a system. Memories are scattered about; that part the memory molecularists had right. Memory is everywhere. But it is everywhere in such a way that it is impossible to point to any one spot and identify it with an explicit memory. We now know that memory, like consciousness itself, isn’t a thing that can be isolated or extracted, but a living process, a vast and dynamic interaction of neuronal synapses involved in what Harvard’s Daniel Schacter elegantly terms “a temporary constellation of activity.” Each specific memory is a unique network of neurons from different regions of the brain coordinating with one another. Schacter explains:

A typical incident in our everyday lives consists of numerous sights, sounds, actions, and words. Different areas of the brain analyze these various aspects of an event. As a result, neurons in the different regions become more strongly connected to one another. The new pattern of connections constitutes the brain’s record of the event.

The power of the constellation idea is reinforced by the understanding of just how connected the 100 billion neurons in the brain actually are. A. G. Cairns-Smith, of the University of Glasgow, observes that no single brain cell is separated from any other brain cell by more than six or seven intermediaries.

The molecular basis for these synaptic constellations that can be reignited again and again (though never in precisely the same configuration) is a biochemical process called long-term potentiation (LTP), which intensifies the affinity between specific neurons after a significant connection is made. Think of an ant farm, with worker ants constantly building new tunnels among one another; once a tunnel is built, transport becomes many times easier; an easy, natural connection has been created between those two points. With memory formation and retrieval, pathways are at first built and later simply used. Each notable experience causes a unique set of neurons to fire in conjunction with one another. As a result, those connections become chemically more sensitive to one another so that they can more easily trigger each other again. With that unique constellation of synapses, one has created a permanent physical trace of the original sensation. Neurologists call these memory traces “engrams.”

The ant farm analogy also applies in another important way: Neurobiologists have found that memory formation is slow. Long-term memories can take many months or even years to fully form.

Long-term memories are durable, but not unassailable. They can last a lifetime, but from the first moments are subject to influences from other memories and experience. Inevitably, as they age and are evoked again and again, all memories change in character.

This is part of the brain’s famous plasticity, its ability to adapt to life’s events. Plasticity makes us as much creatures of our own experience as we are products of evolution. Not everything in the brain is adaptable, of course; much of it comes “hard-wired,” genetically preprogrammed to specialize and perform specific tasks such as processing light and sound, regulating heart rate and breathing, and so on. But the regions reserved for fine motor skills, intelligence, and memory are more like soft clay, able to take on a definite shape and yet remain constantly responsive to new stimuli.

Memory constellations, then, are not fixed, immutable collections of memories, but ever-variable collections of memory fragments that come together in the context of a specific conscious moment. Any common free-association experiment is a vivid illustration of this point. For me, at this moment, the word “cat” prompts → a thought of Brownfoot, my boyhood feline friend → the garage roof she used to leap from → the 1971 T-top Corvette my father used to drive → the tragicomic month in which Mom wrecked this car twice → a feeling of malaise associated with my parents’ divorce years later. This instant montage of memories is neither chronological nor predictable, even by me. If someone were to prompt me with “cat” tomorrow, depending on my mood or recent experience, I might think of the cat that my daughter called to yesterday outside our house. Or it could be that Brownfoot will come to mind, but that from there I will shift to an image of my playing her dentist, and then I might think of my own current dentist and how I’m way overdue for a cleaning. That guilty feeling might then trigger another distant idea, related only by a parallel feeling of guilt. And so on.

Taken together, this interconnected universe of constellations in each of us forms the core of who we are. The memory waves in our life’s ocean wash against one another to create a complex and ever-adapting character.

The director Martin Scorsese is an interesting memory-character study, mostly because he seems to forget very little compared to others. He remembers not just every shot and crew credit from each of the thousands of movies he’s seen, observes the New Yorker’s Mark Singer, but also every detail of every book, song, and personal experience he’s had in fifty-plus years—“all of it,” Singer writes, “seemingly instantly retrievable.”

Singer depicts the Scorsese memory constellation in action. After a colleague criticizes a piece of film dialogue as “too piercing,” Scorsese is instantly thrown into an interconnected memory odyssey:

He was reminded of the old Harry Belafonte calypso tune “The Banana Boat Song”—or, rather, a parody of same by Stan Freberg, which included a reference to “piercing,” and that reminded him of another Freberg routine, a parody of the television series Dragnet, which in turn reminded him of Pete Kelly’s Blues, a feature film directed by Jack Webb, the star of Dragnet. The production designer of Pete Kelly’s Blues, in which Webb played a bandleader during the twenties, was a Disney veteran who brought to it a remarkably vivid palette, a reality-heightening Technicolor glow reminiscent of the live-action Disney children’s films of the forties.… And, Scorsese further recalled, Pete Kelly’s Blues had a screenplay by Richard L. Breen, whose name, curiously, Webb had heralded before the title. When the picture was released, in 1955, the year Scorsese turned thirteen, he followed it from theatre to theatre, as was his habit.… [He then recalled all the specific theaters he used to frequent.] One particular Saturday afternoon double-feature at the Orpheum came to mind: Bomba the Jungle Boy and Great White Hunter.…

The pathways linking engrams can be built on temporal, intellectual, or aesthetic associations, and when the mind really wanders, during daydreams or at night before sleep sets in, it’s amazing what sort of involuntary memory leaps one makes, from impressions that often have no logical or logistical relationship but which share a texture or smell or emotional fragment. What’s more—and this may be the single most important point to understand about memory—every time a memory is recalled, new trails are made.

The act of remembering itself generates new memories. Which means that Emerson was exactly right when he noted in his journal: “Most remembering is only the memory of memories, & not a new & primary remembrance … HDT [Henry David Thoreau] noticed this to me some time ago.” Overlap, in other words, is not only built into the biology of memory. It is the very basis of memory—and identity. New memory traces are laid down on top of a foundation of old memories, and old memories can only be recalled in a context of recent experiences. Imagine a single painting being created over the course of a lifetime on one giant canvas. Every brush stroke coming into contact with many others can be seen only in the context of those prior strokes—and also instantly alters those older strokes. Because of this, no recorded experience can ever be fully distinct from anything else. Whether one likes it or not, the past is always informed by the present, and vice versa.

Scores of experiments confirm the malleability of old memories, and horror stories of False Memory Syndrome are by now widespread. The psychologist Elizabeth Loftus has spent the better part of her career documenting the ease with which false memories can be planted—accidentally or on purpose. Often, these false memories lead to wrongful convictions. In 1979, twenty-two-year-old marine corporal Kevin Green was convicted of second-degree murder for the brutal beating of his wife and the death of their full-term fetus. His wife had testified after coming out of a coma that Green, her own husband, was the attacker. Sixteen years later, the real attacker, a total stranger, confessed to police about that and six other murders. It turned out that Green’s guilt had been suggested to his wife early on in her rehabilitation. By the time it came to trial, she had created a memory so clear that she was able to confidently testify against her husband.

“Eyewitness misidentification … is known as the single greatest cause of the conviction of the innocent,” says attorney Barry Scheck. He describes a typical scenario: “You can have as many as five witnesses who begin in kind of a soft way, saying, ‘That might be the guy,’ and then, like wet concrete hardening, the [memories] get fixed to the point that by the time they get to the courtroom, they’re saying ‘That’s the man.’”

Part of the deep attraction to the idea of distinct memory molecules was that it connoted the ability to replay old memories like videotapes on a VCR—just as they were originally recorded. But the biology of memory constellations dictates that there is no such thing as pure memory. Recall is never replay.

But why? Why would millions of years of evolution produce a machine so otherwise sophisticated but with an apparent built-in fuzziness, a tendency to regularly forget, repress, and distort information and experience?

The answer, it turns out, is that fuzziness is not a severe limitation but a highly advanced feature. As a matter of engineering, the brain does not have any physical limitations in the amount of information it can hold. It is designed specifically to forget most of the details it comes across, so that it may allow us to form general impressions, and from there useful judgments. Forgetting is not a failure at all, but an active metabolic process, a flushing out of data in the pursuit of knowledge and meaning.

We know this not just from brain chemistry and inference, but also because psychologists have stumbled upon a few individuals over the years who actually could not forget enough—and were debilitated by it.

In his New Yorker profile, Mark Singer wonders if Martin Scorsese is such a person—burdened by too good a memory.

Was it, I wondered, painful to remember so much? Scorsese’s powers of recall weren’t limited to summoning plot turns or notable scenes or acting performances; his gray matter bulged with camera angles, lighting strategies, scores, sound effects, ambient noises, editing rhythms, production credits, data about lenses and film stocks and exposure speeds and aspect ratios.… What about all the sludge? An inability to forget the forgettable—wasn’t that a burden, or was it just part of the price one paid to make great art?

For some perspective on the inability to forget, consider the case study that psychologists call S. In the 1920s, S. was a twenty-something newspaper reporter in Moscow who one day got into trouble with his editor for not taking notes at a staff meeting. In the midst of the reprimand, S. shocked his boss by matter-of-factly repeating everything that had been said in the meeting—word for word.

This was apparently no stretch at all for S., who, it emerged upon closer examination, remembered virtually every detail of sight and sound that he had come into contact with in his entire life. What’s more, he took this perfect memory entirely for granted. To him, it seemed perfectly normal that he forgot nothing.

The editor, amazed, sent S. to the distinguished Russian psychologist A. R. Luria for testing. Luria did test him that day, and for many other days over a period of many decades. In all that testing, Luria could not find any real limit to S.’s capacity to recall details. For example, not only could he perfectly recall tables like this one full of random data after looking at them for just a few minutes:

[In the original, a table of random numbers appears here.]

And not only could he efficiently recite these tables backwards, upside down, diagonally, etc., but after years of memorizing thousands of such tables he could easily reproduce any particular one of them, without warning, whether it was an hour after he had first seen it, or twenty years. The man, it seemed, quite literally remembered everything.

And yet he understood almost nothing. S. was plagued by an inability to make meaning out of what he saw. Unless one pointed the obvious pattern out to him, for example, the following table appeared just as bereft of order and meaning as any other:

[In the original, a table of numbers arranged in an obvious 1–2–3–4 progression appears here.]

“If I had been given the letters of the alphabet arranged in a similar order,” he remarked after being questioned about the 1–2–3–4 table, “I wouldn’t have noticed their arrangement.” He was also unable to make sense out of poetry or prose, to understand much about the law, or even to remember people’s faces. “They’re so changeable,” he complained to Luria. “A person’s expression depends on his mood and on the circumstances under which you happen to meet him. People’s faces are constantly changing; it’s the different shades of expression that confuse me and make it so hard to remember faces.”

Luria also noted that S. came across as generally disorganized, dull-witted, and without much of a sense of purpose or direction in life. This astounding man, then, was not so much gifted with the ability to remember everything as he was cursed with the inability to forget detail and form more general impressions. He recorded only information, and was bereft of the essential ability to draw meaning out of events. “Many of us are anxious to find ways to improve our memories,” wrote Luria in a lengthy report on his unusual subject. “In S.’s case, however, precisely the reverse was true. The big question for him, and the most troublesome, was how he could learn to forget.”

What makes details hazy also enables us to prioritize information and to recognize and retain patterns. The brain eliminates trees in order to make sense of, and remember, the forests. Forgetting is a hidden virtue. Forgetting is what makes us so smart.



One of the worst things that I have to do is put on my pants in the morning. This morning I kept thinking there is something wrong because my pants just didn’t feel right. I had put them on wrong. I sometimes will have to put them on and take them off half a dozen times or more.… Setting the washing machine is getting to be a problem, too. Sometimes I’ll spend an hour trying to figure out how to set it.

—B.

San Diego, California




Chapter 4 THE RACE


Taos

“Ten years to a cure,” a Japanese scientist whispered to me in our hotel lobby as we waited for the shuttle bus to the Taos Civic Plaza.

The whisper was as telling as the words. He couldn’t contain his optimism, and yet he also couldn’t afford to put it on display.

Other Alzheimer’s researchers had lately been adopting a similar posture. As scientists, they were reserved by nature. But the recent acceleration of discovery had made them a little giddy. Hundreds of important discoveries had come in recent years, and funding for research was way up. The study of Alzheimer’s was now in the top scientific tier, alongside heart disease, cancer, and stroke research. This seemed fitting, since the disease was emerging as one of the largest causes of death in the U.S., not far behind those other three.

There was now even an Alzheimer’s drug on the market: Aricept, introduced in 1997, boosted the brain’s supply of the neurotransmitter acetylcholine. Some of the functional loss in early Alzheimer’s involves a deficiency of acetylcholine; replenishing it with this drug seemed to help about half of early and middle-stage patients to slow or even arrest the progression of symptoms for a year or more.

On the one hand, this was a giant advance: a real treatment that often made a tangible difference. But it was also a frustrating baby-step: Aricept did not slow the advance of the actual disease by a single day. It only worked on the symptoms. Scientists couldn’t stop Alzheimer’s yet—only put a thick curtain in front of it for a while.

More ambitious advances were brewing. An electronic update service named Alzheimer’s Weekly had been launched in 1998. Neurologists in the 1960s would have considered this phrase a sarcastic reference to the drudging nature of discovery: Understanding of the disease was practically frozen for more than half a century. But after a thaw in the 1970s and a renewed effort in the ’80s, genetic and molecular discoveries started to cascade so quickly by the mid-1990s that the excavation of Alzheimer’s seemed to be moving at the same clip as sporting events and financial markets.

Now a weekly update was not only useful but essential. In fact, updates on other Web sites came almost daily:

News from the Research Front



3 September 1998. H. J. Song et al. report that they are able to manipulate growth cones …

5 September 1998. Puny polymer pellets show promise as a vehicle for delivering nerve-growth factor to the basal forebrain …

6 September 1998. A novel brain-imaging agent promises to open up a window on the functioning of the brain’s dopamine system …

10 September 1998. Findings published in Nature Neuroscience indicate that the accumulation of calcium in the mitochondria triggers neuronal death …

10 September 1998. C. Y. Wang et al. report they have identified four genes that are targets of NF-kB activity …

11 September 1998. E. Nedivi et al. describe CPG15, a molecule that enhances dendritic arbor growth in projection neurons …

—from the Alzheimer Research Forum (at www.alzforum.org)

The research was so intensely specialized that few individual scientists appeared to even be working on the problem of Alzheimer’s disease per se. It was more like each was unearthing a single two-inch tile in a giant mosaic. By themselves, these individual experiments were so narrowly focused that they were far removed from a comprehensive understanding of the disease. But the minutiae had a purpose. If the great challenge of Alois Alzheimer had been to distinguish a general pathology of dementia from the normal cells of the brain, the task of contemporary scientists—employing exotic techniques with names like fluorescent protein tagging, immuno-lesioning, and western blot analysis—was to try to see what the process looked like in flux. Alzheimer glimpsed a mono-colored, silver-stained microscopic snapshot. Contemporary scientists, crunching and exchanging data with parallel processors and fiber optics, were trying to patch together more of a motion picture. Once they understood the actual disease process, particularly the early molecular events, they hoped they would be able to proceed toward genuine therapies.

The research had expanded in every direction, and had also gone global. Thousands of scientists from every continent now worked on the problem, as time became critical. In a little over a decade, the much-anticipated “senior boom” would begin, eventually quadrupling the number of Alzheimer’s cases and making it the fastest-growing disease in developed countries. In addition to the sheer misery, the social costs of such a slow, progressive disease would be staggering. In the U.S., the costs of doctors’ visits, lab tests, medicine, nursing, day care, and home care were already estimated at $114.4 billion annually. That was more than the combined budgets for the U.S. Departments of Commerce, Education, Energy, Justice, Labor, and Interior.

“We have to solve this problem, or it’s going to overwhelm us,” Zaven Khachaturian said. “The numbers are going to double every twenty years. Not only that: The duration of illness is going to get much longer. That’s the really devastating part. The folks who have the disease now are mostly people who came through the Depression. Some had college education, but most did not. The ones who are going to develop Alzheimer’s in the next century will be baby boomers who are primarily much better educated and better fed. The duration of their disability is going to be much longer than the current crop. That’s going to be a major factor.

“See, in considering the social impact of the disease, it’s not so much the pain and suffering that matters. From the point of view of the individual, that is of course the important factor. But from the point of view of society, what’s important is how long I am disabled and how much of a burden I am to society. With cancer and heart disease, the period where I cannot function independently is fairly short—three to five years. With Alzheimer’s, it’s going to be extremely long—like twenty years, where you are physically there, you don’t have any pain, you appear normal, and yet you have Alzheimer’s. You cannot function independently.”

If Khachaturian was correct, and the average duration of the disease was set to more than double, then the problem would be even worse than epidemiologists were predicting. Either way, it was clear that if Alzheimer’s disease was not conquered reasonably soon, it would become one of the most prominent features of our future. Nationally, the number of nursing home beds would at least quadruple. (The stay of Alzheimer’s sufferers in a nursing home is, on average, twice as long as that of other patients.) We would need vastly more home health care workers, elder-care nurses and physicians, assisted living facilities, day-care programs and support groups. (There was already a grave shortage of qualified professional caregivers—and, due to the low pay, a shocking annual turnover rate of 94 percent.) Family leave would also have to be redefined. Progressive nations would likely adopt a system of employee flexibility for senior care (extended leave, flexible work hours, and so on) similar to the one recently implemented for new parents in the U.S.—with the added caveat that reverse parenthood lasts significantly longer and is more draining than conventional parenthood.

All this would cost money, and would require a painful shift in resources away from other public needs. Public officials would be forced into difficult decisions. Would the U.S. government, for example, continue to allow an Alzheimer’s patient to give away all assets to his children in order to qualify for government-sponsored care? Would governments require citizens to have some sort of dementia or frailty insurance?

And what about public safety issues? With as many as fifteen million people suffering from insidious (and largely invisible) cognitive decline, how would we ensure street and highway safety without automatically invalidating all driver’s licenses of senior citizens?

There was a dual race on, then. Researchers were racing against one another, and against time. The prize for the winner of this race—if there was to be a winner—would be worldwide fame, nearly unparalleled professional esteem, enormous wealth, and the pride of knowing that you were personally responsible for preventing an ocean of future human suffering.

One glimpse into the magnitude of an Alzheimer’s cure: In the nearly fifty years since Jonas Salk and Albert Sabin introduced their vaccines against polio, somewhere between 1 and 2 million lives have been saved. Curing Alzheimer’s disease sometime in the first decade of the twenty-first century would save as many as 100 million lives worldwide in the same length of time.

The far-flung researchers kept in touch by E-mail, phone, and fax, and accomplished much in their labs spread out all over the world. Still, every so often, they needed to come together physically, to be in the same room to check their progress, to goad each other, critique and criticize each other, to energize.

In 1999, the gathering place was Taos. They came from everywhere, a global convergence of neuromolecular intelligentsia sitting on fold-out chairs in Bataan Hall to share knowledge and probe their ignorance. There was still so much they didn’t know: Why are women more susceptible to Alzheimer’s than men? Why are Cree and Cherokee Indians less susceptible than the rest of us? What is it about the environment in Hawaii, as contrasted with Japan, that apparently doubles one’s chances of getting Alzheimer’s? Why do a third of Alzheimer’s victims develop Parkinson’s disease but not the other two-thirds? Why do some cigarette smokers seem to be less likely to develop Alzheimer’s than nonsmokers?

After ninety-plus years, the field was littered with half-answers to these questions—and far more basic ones: Does Alzheimer’s have one cause or many? Is it really one disease or a collection of very similar diseases? Which come first—plaques or tangles? Why do they always originate in the same part of the brain? How long do they proliferate before they begin to affect brain performance? Why do some people accrue a brain full of plaques and tangles but never display any symptoms of the disease? Is anyone naturally immune to Alzheimer’s?

So, humbly, they gathered. With respect for the vexing nature of this disease, the molecular biologists and geneticists spent thirty hours listening to theories of plaque and tangle formation, and intervention strategies. After each short talk, they quickly lined up behind a microphone in the aisle to poke the presenter with questions, looking for holes in the research and analysis. The tone was alternately respectful and suspicious, and occasionally hostile.

Hostile because of the billions of dollars at stake, and also because of a fracturing debate within the community about which aspect of research mattered most. A nearly one-hundred-year-old question still had not been answered: Which are closer to the root of the problem—the plaques or the tangles?

Alois Alzheimer thought it was the tangles. “We have to conclude,” he wrote in 1911, “that the plaques are not the cause of senile dementia but only an accompanying feature.”

Most, however, now said the plaques. In a field where there were so many open questions and possible approaches, the vast majority of researchers in this room and elsewhere were focused tightly on the issue of plaque formation, while relatively few were concerned with tangles and only a handful of others busied themselves with important issues like inflammation, viruses, and possible environmental factors.

The disparity bothered many. “When I was a little girl, I wanted to go into science because I thought it was a very open community,” Ruth Itzhaki, a biologist from the University of Manchester, told me one morning in Taos. “I learned better. It is, in fact, a very cynical community layered with politics and filled with people who just want to follow the herd.” Itzhaki was herself embittered by her struggle to fund research linking the herpes simplex virus 1 (HSV1) with Alzheimer’s.

Could Alzheimer’s be herpes of the brain? It was not the most prominent theory of the day, but no one could rule it out. Nearly all humans are infected by HSV1 by the time they reach middle age. The virus mostly seems to lie dormant but can become active and create cold sores and other hazards in times of stress. Whether or not HSV1 does any damage depends largely on individual levels of immune response and on genetic makeup.

In her presentation, Itzhaki said she had found evidence of HSV1 presence in the temporal and frontal cortex of the brain, as well as in the hippocampus—three areas closely associated with Alzheimer’s. She posited that the virus might be interacting with a particular gene to set the disease process in motion. If proven true, a massive new global infant immunization project would be in order.

But the crowd in Taos did not seem very interested. Her talk drew little in the way of response. The focus quickly shifted back to plaques.



One evening I got a telephone call from a friend. He was telling me what a tough day he’d had on the job; he’d made several mistakes. “If you have Alzheimer’s, I must have a double dose of it,” he said.

I could feel myself entering a state of rage. “Do you forget simple words, or substitute inappropriate words, making your sentences incomprehensible? Do you cook a meal and not only forget you cooked it, but forget to eat it? Do you put your frying pan in the freezer, or your wallet in the sugar bowl, only to find them later and wonder what in the world is happening to you? Do you become lost on your own street? Do you mow your lawn three or four times a day? When you balance your checkbook, do you completely forget what the numbers are and what needs to be done with them? Do you become confused or fearful ten times a day, for no reason? And most of all, do you become irate when someone makes a dumb statement like you just made?”



