Marjie Popkin thought she had chemo brain, that fuzzy-headed forgetful state that she figured was a result of her treatment for ovarian cancer. She was not thinking clearly — having trouble with numbers, forgetting things she had just heard.
One doctor after another dismissed her complaints. Until recently, since she was functioning well and having no trouble taking care of herself, that might have been the end of her quest for an explanation.
Last year, though, Popkin, still troubled by what was happening to her mind, went to Michael Rafii, a neurologist at the University of California, San Diego, who not only gave her a thorough neurological examination but also administered newer diagnostic tests: an MRI that assesses the volume of key brain areas, and a spinal tap.
Then he told her there was something wrong. And it was not chemo brain. It most likely was Alzheimer’s disease. Although she seemed to be in the very early stages, all the indicators pointed in that direction.
Until recently, the image of Alzheimer’s was the clearly demented person with the sometimes vacant stare, unable to follow a conversation or remember a promise to meet a friend for lunch.
Popkin is nothing like that. To a casual observer, Popkin seems perfectly fine. Articulate and groomed, she is in the vanguard of a new generation of Alzheimer’s patients, given a diagnosis after tests found signs of the disease years before actual dementia sets in.
But the new diagnostic tests are leading to a moral dilemma. Since there is no treatment for Alzheimer’s, is it a good thing to tell people, years earlier, that they have this progressive degenerative brain disease or have a good chance of getting it?
“I am grappling with that issue,” Rafii said. “I give them the diagnosis — we are getting pretty good at diagnosis now. But it’s challenging because what do we do then?”
It is a quandary that is emblematic of major changes in the practice of medicine, affecting not just Alzheimer’s patients. Modern medicine has produced new diagnostic tools, from scanners to genetic tests, that can find diseases or predict disease risk decades before people would notice any symptoms.
At the same time, many of those diseases have no effective treatments. Does it help to know you are likely to get a disease if there is nothing you can do?
“This is the price we pay” for the new knowledge, said Jonathan Moreno, a professor of medical ethics and the history and sociology of science at the University of Pennsylvania.
“I think we are going to go through a really tough time,” he added. “We have so much information now, and we have to try to learn as a culture what information we do not want.”
Some doctors, like John Morris of Washington University in St. Louis, say they will not offer the new diagnostic tests for Alzheimer’s — like MRIs and spinal taps — to patients, because it is not yet clear how to interpret the results. He uses the tests in research studies but does not tell subjects what they show.
“We don’t know for certain what these results mean,” Morris said. “If you have amyloid in your brain, we don’t know for certain that you will become demented, and we don’t have anything we can do about it.”
But many people want to know anyway and say they can handle the uncertainty.
That issue now confronts investigators in a large federal study of early signs of Alzheimer’s. The researchers, who include Morris, have been testing and following hundreds of people aged 55 to 90 — some with normal memories, some with memory problems and some with dementia. So far, only the investigators know the results. Now the question is: should those who want to learn what their tests show be told?