Sunday, April 03, 2011

Placebo: How a sugar pill became a poison pill. Part 2 of a continuing saga...

Read part 1.

In this mad rush to fill the collective national bloodstream with bug-killing fluids, doctors paid little attention to “subtleties” like proper dosing. An antibiotic regimen marred by ill-conceived dosing—too-small concentrations that end too soon—decimates bacteria but does not kill them off entirely. Although the penicillin-aided body may rise up and overwhelm the microbes in the present instance of infection, the few surviving bacteria—as if bent on proving the wisdom of the famous Nietzsche quote about “what doesn't kill us”—form the prospective beginnings of a stronger super-strain. The hardy stragglers find a new host and breed wildly, now immune to the original, haphazardly administered antibiotic.

Here our narrative takes a brief but important detour to Mexico. As the Ozzie & Harriet era hit its stride, a new wave of Mexican immigrants streamed across the border, excited by this revivified American Dream exploding just to the north. Sensitive to the needs of these new arrivals—most of whom were separated from formal healthcare by barriers linguistic and financial—farmacias in fast-growing Hispanic neighborhoods quietly began selling antibiotics over the counter. In turn, their customers began self-dosing indiscriminately (and, when feasible, sending mercy shipments of some portion of their pharmaceutical “score” back across the border to relatives who'd stayed behind). The renegade druggists, too, knew that their compassion was apt to have little or no bearing on a patient's actual health, but by this time they were part of an inexorable zeitgeist, an authentic cultural revolution. They had been swept up in a current that washed away reason.

What the various component parts of this vast antibiotic-dispensing machinery could not have known or even imagined at the time was that they were also incubating an accidental business model. And much like a highly contagious cold itself, that business model would one day infect every precinct of healthcare, from the neighborhood Marcus Welby to the Mayo Clinic, with devastating implications for the practice of medicine. The business model was premised on a deceptively simple idea: that you had to treat something, even if you realistically had nothing to treat it with. Indoctrinated in this thinking from the very beginning of their academic careers—and placing complete faith in the new technologies forever being supplied to them by a burgeoning medical-equipment industry—future generations of doctors would come to conceive of cancer, heart disease and other major ailments in the same way their predecessors once conceived of the common cold.

If this mentality were reduced to a bumper sticker, it would be an updated twist on the old Cartesian principle of existence: I'M A DOCTOR, ERGO I TREAT.

At the same time, a complementary phenomenon was taking shape in the form of a psychological pas de deux danced by doctors and their patients. The doctors, for their part, were cultivating what might be called a placebo affect: a bedside manner and reassuring patois designed to inspire a level of faith in their healing arts that was seldom vindicated by the underlying science. Meanwhile, large numbers of Americans fulfilled their part of the bargain by going to the doctor for the sheer peace of mind of, well, going to the doctor. “I think the family physician became a kind of security blanket for suburban America,” says the University of Pennsylvania's Arthur Caplan, one of the nation's best-known bioethicists and a keen-eyed chronicler of medical trends. “It's as if the medical outcome was almost secondary.” Healthcare thus became a misguided alliance between physician and patient in which the latter came to rely on the former for a psychosomatic cure-all: The visit was itself the treatment, allowing the patient to believe that something was being done, whether or not such was legitimately the case. “Most people who go to the doctor are going to get better anyway,” explains Dr. Sally Satel, co-author of One Nation Under Therapy: How the Helping Culture Is Eroding Self-Reliance. “That's because most illnesses are self-limiting. They won't kill you, and sooner or later they just go away. But that's not how the average person looks at it. People feel they're supposed to go to the doctor when they're sick, and then when they get better, they credit the doctor and his treatment.”

In the 1950s, Americans dragged themselves and their families to the doctor in droves. During the so-called “Golden Age of Antibiotics,” between the post-war period and the early 1980s, life expectancy jumped by some eight years, an increase attributed at the time to the wide availability of antibiotic compounds. (That attribution, too, would prove fallacious.) So encouraging was the trend line that in 1969, U.S. Surgeon General William Stewart smugly announced the end of pathogen-borne illness: In a brash and boastful speech to Congress, he told legislators it was “time to close the books on infectious diseases.” For sheer naivete, the remark rivaled the line attributed to Charles Duell, commissioner of the U.S. Patent Office at the dawn of the 20th century. Duell is said to have lamented that there was little future in his line of work, since “everything that can be invented has already been invented.”*

We now know better. We know because the world is living with the fallout from indiscriminate use of antibiotics and the haughtiness that occasioned it. And we're feeling the effects of that fallout not just in terms of pharmaceuticals, but in every single area of medical practice.

To be continued...

* In the interest of journalistic accuracy, it should be noted that whether Duell actually made the remark in exactly those words remains a matter of dispute.
