Here's a possible dilemma for you to think through. It's time for your yearly eye checkup. So one evening, you drop by your friendly nearby optometrist. You're expecting her to sit you down as usual, get you to try reading the usual chart across the room that has letters getting steadily smaller on each row, offer you a card with bits of text in varying font sizes to read ... the usual.
Only, on this visit she tells you there is a new blood test that can, believe it or not, tell you if you might need corrective lenses at some point in the future. She says there's no need to do all the reading tests you're used to. In fact, you now notice that she no longer even has the chart across the room.
Given that she's only administering blood tests now - she's no longer testing your eyes for a decline in vision - the label "optometrist" no longer seems appropriate. In any case, she assures you the test is reliable. How, though, would you react?
While you ruminate over that, consider a parallel of sorts in recent news about Alzheimer's disease. For a long time now, Alzheimer's has been detected by testing a patient's cognitive function. If such a screening hints at a decline, the neurologist will probably ask for more tests. Eventually she might suspect Alzheimer's and ask for a brain scan and a spinal tap.
This is because the disease is known to do two things to the brain. One, it leaves deposits ("plaques") of amyloid-β proteins. Two, it produces aggregations of tau proteins that are known as "tangles". Never mind what each of these proteins actually is. Both are called "biomarkers" of the disease, meaning their presence in the brain strongly suggests the onset of Alzheimer's. Their presence, along with an observed cognitive decline, amounts to an almost certain diagnosis of Alzheimer's.
So if the brain scan shows the amyloid-β plaques and the tau tangles, the neurologist will likely tell the patient they have Alzheimer's. Together, they will work out next steps. For example, they might decide to treat the disease with one of the drugs that have come to market in recent years, like Leqembi. Leqembi won't cure the disease - we know of nothing (yet) that can do that. But it has been shown to slow Alzheimer’s-related cognitive decline by clearing amyloid-β plaques from the brain.
I don't mean to reduce what is necessarily a complex, drawn-out and emotional process to the three paragraphs above - three sketchily informed paragraphs at that, I'll freely admit. I just want to offer a flavour of how people learn about both the onset of Alzheimer's and what they can do about it. For it has got to be hard to get used to the reality of Alzheimer's.
Yet there's now something of a dust-up in scientific circles over Alzheimer's diagnoses. That's because in February this year, a research team announced - cue the parallel to the thought experiment at the start of this essay - "a simple blood test to diagnose Alzheimer’s disease." In essence, this test measures the levels of those Alzheimer's-related proteins in the blood. The scientists show that this blood test can be as effective at detecting early signs of the disease as tests of cerebrospinal fluid and brain scans - but will be far less expensive than those.
Since the new drugs like Leqembi "might be more effective when started sooner rather than later", it becomes critical to identify Alzheimer's cases as early as possible. That's the great virtue of this new blood test. The scientists believe it "can be used to detect molecular signs of Alzheimer’s disease in the brain when symptoms haven’t yet emerged." As one member of the team remarked: "What we really want is to treat the disease before people start losing brain cells and showing symptoms."
The suggestion here is to eventually use only this blood test to diagnose Alzheimer's. (The name "Theranos" comes to mind, say no more.) No need to test for cognitive decline.
Cue the dust-up. At issue is that the presence of the proteins does not necessarily mean the patient does or will soon suffer from Alzheimer's. It does not mean the patient has or will soon show signs of cognitive decline. In fact, it's possible she may never suffer such a decline, may never get Alzheimer's. As one critic of the blood test said, "There’s a risk of misunderstanding and distress that individuals who are asymptomatic will have if we tell them they have Alzheimer’s [after the blood test], whereas nothing will happen in their lifetime in a majority of cases." After all, a 65-year-old man in whose blood the test finds the proteins in question has only a 22% risk of developing Alzheimer's-related cognitive decline.
So is it reasonable to tell the patient - maybe a completely healthy 65-year-old man - that he is at risk and so should start the Leqembi regimen right away? The diagnosis devastates him, and yet it's possible he may never get Alzheimer's. But he feels compelled nevertheless to start on one of the new drugs - drugs that are ineffective in someone with no symptoms of the disease, and that carry serious possible side-effects of their own.
Is this reasonable? Well, is it reasonable to rely on a blood test that says you might at some point need corrective lenses for your eyes? Are these two situations - one fictional, one real - comparable?
I won't answer those questions, though perhaps it's clear what my answers might be. Instead, I'll leave you with something I read about recently.
In a town called Salta in Argentina a few years ago, the journalist Madhumita Murgia met two brothers who collaborated with Microsoft on a piece of software that was to be used by the local administration. She writes about this in her book Code Dependent.
And what was this software all about? You need to know that Salta is substantially poorer than the rest of Argentina, and has a substantially higher rate of teenage pregnancies. There are reasons for this, but I won't get into them here. To start with, the software would need a lot of data about each of the town's young women - "her socioeconomic history, her education, her reproductive history" and more. Armed with this data, Microsoft suggested using the "power of AI ... to build algorithms to predict which girls were likely to get pregnant in their teens." In fact, the brothers hoped to "generate a list of families whose daughters' risk of teen pregnancy was above 60 per cent."
Sure enough, the then Governor of Salta proclaimed that this software "can predict five or six years ahead which girl, or future teenager, is 86 per cent predestined to have a teenage pregnancy." His government could then focus on those girls and "help prevent" the pregnancies.
The "86 per cent predestined" pregnancies, if you please.
To Murgia, this whole effort was "problematic". To me too, for that matter, though I'd use a far stronger word. But it made me think hard about something that was once my career - software. That then made me think hard about blood tests.
And putting it all together, I'm left wondering: is a teenage pregnancy ever "predestined"? What are the ethics of treating it like that? That apart, of course I realize the parallels between a societal dilemma and a medical condition only go so far. Still, is Alzheimer's ever "predestined" in an analogous way?
Should we leave it to a simple blood test?