Read Part 4.
“Looking back over the past century, it is clear that medical science has made breathtaking advances,” said Roche CEO Franz Humer after a few opening pleasantries. “This is shown, for instance, by the fact that life expectancy has risen enormously to around 80 years, compared with 55 in the late 19th/early 20th century when Roche was established.” He continued in that vein for a while, touting the links between Big Pharma and enhanced lifespan, underscoring his industry's instrumental role in the progress of Mankind, subtly highlighting the Roche brand throughout. “While it may be unwarranted at this point to expect the same historic increases in longevity looking forward,” he conceded toward the end, “there is little doubt that whatever gains we realize will be directly related to what we in our industry do.”
To humanize his point, he shared a few choice stories of striking longevity from the corporate archives, culled from a Roche outreach to doctors prescribing the company's products. When audience members applauded, they were applauding themselves as well.
Today's medicine is full of Franz Humers: men and women who say all the right things because it's in their interest to do so. Nowhere is this ethic stronger or more visible than in the area of life extension. Enhanced longevity via cutting-edge medicine is how the healthcare monolith puts a statistical face on its promises (though on close inspection that face turns out to be more of a carnival mask). From their perspective—and the perspective of most consumers—longevity is the ultimate metric of medical worth. It is how the Franz Humers of the world tangibilize the value of what they do. Longevity is also the quintessential debate-stopper, the trump card that makes all other points sound trifling, all skepticism sound contrary and quarrelsome. If doctors can show (or appear to show) that they deliver added life, then the patient (a) is predisposed to use more medicine and pay more for it and (b) will spend less time questioning what he pays now. In the same way, outsiders will spend less time scrutinizing what the healthcare industry does. Both literally and figuratively, the industry's stock goes up.
When it comes to taking credit for the march toward immortality, the hubris of the medical establishment is boundless and nonsectarian. Some insiders unabashedly project forward, reasoning from the purported gains in longevity over the past quarter-century that humans in the next century can routinely expect to live well past 100. Dr. Donald B. Louria of the Healthful Life Project—subsidized by Roche—writes, “I believe the question increasingly is not whether life expectancy in the United States at birth will increase from the current 77 years to 100 or even 120 years, but when.” Such musings are a way of setting a tone, establishing a warm buying (and investing) climate, without the point-by-point specificity that could come back to haunt someone later if construed by the SEC as an improper type of forward-looking statement.
It's an artful balancing act, and one that industry insiders have honed and perfected, such that no matter who's speaking, there's a decided sound-bite sameness to the words. Another pharmaceutical titan, Merck & Co., manufactures the anti-cholesterol compound Zocor. Here's Merck CEO Richard Clark (total 2009 compensation: $19.9 million) expounding on anti-aging: “At the start of the previous century, humans barely made it into their 50s. Today they routinely survive well into their 80s or more. The average lifespan is close to 78. How much clearer could the benefits of medicine be?” Hmm. Now where have we heard that before? Pioneering Texas Heart Institute surgeon Dr. Denton Cooley, who in 1968 performed America's first successful heart transplant, sounded as if he and his colleagues deserved personal thanks for improving longevity when he told a 1991 interviewer, “It's simple math: A half-million times each year surgeons do the [coronary bypasses] that we perfected here at the Institute... Those are a half-million men and women each year who have lived on and helped create this century's 25-year improvement in lifespan.” And in a keynote speech to another major symposium, Australian Health 2000, Michael Wooldridge, the nation's former Minister for Health, had this to say of Australia's track record in the area of life extension: “There has been a 20-year gain in life expectancy, for men from 55 years in 1900 to 76 years today and for women from 59 years to 82 years. Millions owe their very lives to the pills they take each morning with their juice and toast.”
It's not surprising that Wooldridge would credit much of that progress to pharmaceuticals, since he came from that sector of healthcare, and hoped to return there after his stint in government. This revolving door between politics and industry is another factor that raises questions about the credibility of the “official” information the public gets on longevity. During his tenure as Health Minister, Wooldridge was accused of having too-cozy ties with Big Pharma; critics alleged a serious conflict of interest that compromised government oversight of the drug industry. Unfazed, Wooldridge promptly appointed to a major healthcare regulatory agency a former executive of GlaxoWellcome-Australia who had just retired as head of the Australian Pharmaceutical Manufacturers' Association. (GlaxoWellcome is now part of the $28 billion GlaxoSmithKline empire. CEO Andrew Witty's total 2009 compensation including bonuses: $12 million.) A few years after Wooldridge left government, the Pharmaceutical Manufacturers—perhaps grateful for his advocacy and friendship?—appointed him head of their association.
Although it is not known whether Michael Wooldridge's audience applauded at the end of his speech that day, many in attendance were the same types who gave such a warm reception to Humer: numbers types, bottom-line types. Albeit a particular kind of numbers and bottom lines: the kind that appear in prospectuses and annual reports to shareholders. For the folks in Healthcare Central, those numbers have been quite good for quite some time.
And because they want things to stay that way, they're coy about another set of numbers—the ones that would reveal the grievously disappointing truth about supposed advances in human longevity.
To be continued...
Monday, April 25, 2011
Sunday, April 17, 2011
Read part 3.
On a fine spring afternoon in March 2005, pharmaceutical executive Dr. Franz Humer, a man who'd grown accustomed to the applause of his peers, rose at a major Zurich symposium to deliver a speech guaranteed to generate more of the same.
Humer had a knack for painting colorful frescoes of the oft-hazy Big Pharma landscape. Despite fierce competition for market share in their respective niches—e.g. Viagra vs. Cialis vs. Levitra for erectile supremacy—the industry's major players periodically would call a truce long enough to attend conferences that celebrated their collective genius. In that respect they were like Hollywood on Oscar Night: Whatever jealousies exist over box-office receipts, whatever backstage machinations are in play over plum roles, they fade away as one and all bask for a few hours in the magic of make-believe. As it happens, that's an analogy with more than passing relevance here.
Humer was then CEO and board chairman of F. Hoffmann-La Roche, nowadays simply “Roche.” Also known for his nonpareil efficiency, Humer that morning had made the 85-km commute from his company's Basel headquarters ensconced in the buttery luxury of a stretch Mercedes limo from Roche's corporate fleet. Had the trip been much longer, he would've flown, not driven, likely on Roche's corporate jet, which was similar to the jet Humer kept for his personal use, though admittedly a tad larger.
From Humer's point of view, the timing of the speech could hardly be better. With the first fiscal quarter not yet complete, Roche was well on its way to posting record sales of $35 billion and returning to investors an incremental yield of 50 percent over 2004. Part of this success could be traced to the company's established positions in the perennially hot mental-health segment, where Roche boasts a track record that few can match. In the 1950s the company pioneered the game-changing class of anti-anxiety drugs known as benzodiazepines. The hit parade began in 1957 with Librium, which became an instant (and over-prescribed) darling of psychiatrists everywhere. Three years later Roche topped even that by launching one of the most commercially successful drugs ever, Valium. To this day the drug remains a staple not just for treatment of anxiety but for surgical sedation as well. But Roche's soaring good fortunes in the spring of 2005 had more to do with its distribution of the first oral drug approved for use against the two primary types of influenza—Tamiflu—which Roche had licensed a decade earlier from a U.S. biotechnology firm, Gilead Sciences. In truth, the Roche/Gilead relationship was a stormy one. Tamiflu had been subject to a series of formal warnings and recalls that led to messy litigation in which the biotech David accused the pharmaceutical Goliath of uninspired marketing, poor quality control, and miscalculating Gilead's royalty payment on sales of the drug. Still, with fears of avian flu then sweeping the globe faster than the malady itself, and with other flu strains suddenly glowing on parents' radar screens, Tamiflu was the right drug at the right time. The two companies called off their legal teams and patched things up, and Roche was now raking in profits hand over fist.
All of which delighted Franz Humer, who was, first and foremost, a money man. The “Dr.” before his name misleads. Humer's doctorate is in law, not medicine. Also, like many of those at today's upper echelons of medical administration, Humer holds an MBA. At the time of his speech, he had occupied his lofty position at Roche since 1998, and in the intervening years had seen the right side of the corporate sales chart climb ever higher. (Such achievements helped soften the sting of Roche's being named, by one leading consumer watchdog group, “top corporate criminal of the 1990s” for its “anti-consumer, anti-competitive practices.”) In 2005, Humer himself would receive a base salary of 8.4 million Swiss francs, which sounds like a lot of money until you realize that it's barely $8.3 million U.S. Fortunately for Humer, his contract included substantial bonuses and equity participation, both of which were about to kick in with a vengeance. The total compensation package made him the third-best-paid CEO among Europe's publicly traded companies.
As an administrative hired gun, Franz Humer was an elite member of a managerial species that in recent decades has taken over from the doctors and scientists who once ran organized medicine. (After leaving Roche in '08, Humer became a board member at Diageo, whose only connection to healthcare is that its customers sometimes need it: Diageo's top brands include Smirnoff, Jose Cuervo and Captain Morgan.) These new-breed healthcare honchos recognize that Job 1 is to hit quarterly earnings targets, to deliver a healthy bottom line. An increasingly healthy bottom line. Further, they must deliver it again and again, thus meeting Wall Street's ever-rising expectations, feeding its insatiable hunger for More.
Self-aggrandizement—of their companies, their products, their managerial acumen—is a key part of the business plan. In pharmaceuticals especially, perception is reality. If the consuming public thinks that certain pills will melt its fat or mute its migraines, then the manufacturer of those pills makes money and The Street is happy, regardless of whether the pills' effects can be documented in a scientific way. It is therefore essential that top brass miss no opportunity to reinforce these perceptions by accentuating the positive. On that day in March of 2005, Franz Humer was accentuating big-time.
To be continued...
Sunday, April 03, 2011
Read part 2.
To be sure, the early warning signs were there for the seeing. Sulfa drugs actually predated penicillin, but within two years of their debut became problematic for general usage because so many of the target bacteria had mutated into resistant strains. This omen would be ignored, as would a gloomy caution from the father of penicillin himself.
“The greatest possibility of evil in self-medication is the use of too-small doses so that instead of clearing up infection the microbes are educated to resist penicillin,” Alexander Fleming lamented to The New York Times in 1945. Fleming foresaw the debut of types of “septicaemia or pneumonia which penicillin cannot save.”
Similar clarion calls were issued, but not heeded, from time to time in the decades that followed. “Little by little,” wrote Harvard infectious disease expert Dr. Maxwell Finland in a 1978 editorial for the New England Journal of Medicine, “we are experiencing the erosion of the strongest bulwarks against serious bacterial infection.” No matter. Two additional decades would pass before a meaningful concerted effort was undertaken to stem the ever-rising tide of antibiotic proliferation. That top-down effort commenced in 1999, a year after doctors and hospitals achieved a dubious milestone: They wrote an unprecedented 80 million prescriptions for penicillin, streptomycin and other antibiotics. That encompassed some 25 million pounds of pills flying out of American pharmacies and hospital dispensaries.
By that time an irreparable amount of damage had been done. In the years between 1945 and 1998, almost every known bacterial pathogen developed resistance to one or more commonly used antibiotics. By 1984, half of all Americans who contracted tuberculosis had a strain that resisted at least one antibiotic. In a recent WHO study, 25 percent of cases of bacterial pneumonia involved microbes that were resistant to penicillin, and an additional 25 percent were complicated by resistance to multiple antibiotics. Each year Americans contract 150,000 cases of pneumonia and 15,000 cases of bacterial meningitis for which effective antibiotics cannot be found. A sobering percentage will die, especially if the patients are elderly and/or infirm.
Bacteria have proved to be authentically diabolical foes of Mankind, displaying a Darwinistic genius that is unmatched in Nature. It appears that one strain can even share key parts of its genetic coding with another strain, thereby “teaching” a wholly different class of bacteria how to defeat a given drug. So it is that nearly all strains of Staphylococcus aureus—which, in the 1950s, could be treated successfully with a single penicillin regimen—today are resistant not just to penicillin but to many other antibiotics as well. Collectively these super-strains are lumped under the familiar umbrella term Methicillin-resistant Staphylococcus aureus, or “MRSA.” MRSA is deemed responsible for 14,000 annual hospital deaths that would not (and should not) have occurred, based on the ailments for which those victims initially were admitted. For surviving patients, MRSA vastly complicates their recoveries and extends their average hospital stay threefold. To such grim figures one must add the 140,000 annual emergency-room visits caused directly by adverse antibiotic reactions, as per a CDC study reported in September 2008. Estimates of the incremental cost antibiotic resistance inflicts on society range as high as $35 billion.
In some hospitals and childcare settings, antibiotic resistance is so entrenched that any attempt at treatment with low-cost, garden-variety antibiotics is pointless. Practitioners must turn instead to more costly, exotic compounds—a tactic that inevitably nurtures resistance to those drugs, too, thus calling for even newer and costlier compounds. “Treatment” becomes a maddening catch-22, a frantic race to stay a step ahead of the ever-evolving infectious agents. It's not only alarmists who think this is a race that bacteria just might win: that the bugs will develop immunity to all existing pharmaceutical countermeasures before we get enough new drugs into the pipeline to combat them. (Strangely enough, it may be the lowly but nearly indestructible cockroach that provides an answer to this riddle. New research shows that chemicals in a roach's brain provide the creature with superior resistance to the bacteria that have themselves become resistant to antibiotic measures. Scientists are exploring ways of harnessing this substance and applying it in human settings.)
The dire predicament has even taken as its casualty many of the antibiotics manufacturers themselves. With the cost of developing and commercializing a new drug now approaching $2 billion, antibiotics manufacturers and their investors have come to see their marketplace niche as unsustainable—a case of throwing good money after bad. Why invest such colossal sums in a new drug that may be obsolete within a year or two? For such reasons, as well as the byzantine vagaries of the drug-approval process, manufacturers are exiting the sector in droves. Thirty-six companies made assorted antibiotics in 1980. At this writing, fewer than a half-dozen remain. Though some of that attrition is a natural byproduct of industry consolidations, there's no question that the sector attracts far less interest these days from biotech firms and the venture capitalists who fund them.
And that is how a sugar pill becomes a poison pill.
To be continued...
Read part 1.
In this mad rush to fill the collective national bloodstream with bug-killing fluids, doctors paid little attention to “subtleties” like proper dosing. An antibiotic regimen marred by ill-conceived dosing—too-small concentrations that end too soon—decimates bacteria but does not kill them off entirely. Although the penicillin-aided body may rise up and overwhelm the microbes in the present instance of infection, the few surviving bacteria—as if bent on proving the wisdom of the famous Nietzsche quote about “what doesn't kill us”—form the prospective beginnings of a stronger super-strain. The hardy stragglers find a new host and breed wildly, now immune to the original, haphazardly administered antibiotic.
Here our narrative takes a brief but important detour to Mexico. As the Ozzie & Harriet era hit its stride, a new wave of Mexican immigrants streamed across the border, excited by this revivified American Dream exploding just to the north. Sensitive to the needs of these new arrivals—most of whom were separated from formal healthcare by barriers linguistic and financial—farmacias in fast-growing Hispanic neighborhoods quietly began selling antibiotics over-the-counter. In turn, their customers began self-dosing indiscriminately (and, when feasible, sending mercy shipments of some portion of their pharmaceutical “score” back across the border to relatives who'd stayed behind). The renegade druggists, too, knew that their compassion was apt to have little or no bearing on a patient's actual health, but by this time they were part of an inexorable zeitgeist, an authentic cultural revolution. They had gotten swept up in a current that washed away reason.
What the various component parts of this vast antibiotic-dispensing machinery could not have known or even imagined at the time was that they were also incubating an accidental business model. And much like a highly contagious cold itself, that business model one day would infect every precinct of healthcare, from the neighborhood Marcus Welby to the Mayo Clinic, with devastating implications for the practice of medicine. The business model was premised on a deceptively simple idea: that you had to treat something, even if you realistically had nothing to treat it with. Indoctrinated in this thinking from the very beginning of their academic careers—and bearing complete faith in the new technologies forever being supplied to them by a burgeoning medical-equipment industry—future generations of doctors would come to conceive of cancer, heart disease and other major ailments in the same way their predecessors once conceived of the common cold.
If this mentality were reduced to a bumper sticker, it would be an updated twist on the old Cartesian principle of existence: I'M A DOCTOR, ERGO I TREAT.
At the same time, a complementary phenomenon was taking shape in the form of a psychological pas-de-deux danced by doctors and their patients. The doctors, for their part, were cultivating what might be called a placebo affect: a bedside manner and reassuring patois designed to inspire a level of faith in their healing arts that was seldom vindicated by the underlying science. Meanwhile, large numbers of Americans fulfilled their part of the bargain by going to the doctor for the sheer peace of mind of, well, going to the doctor. “I think the family physician became a kind of security blanket for suburban America,” says the University of Pennsylvania's Arthur Caplan, one of the nation's best-known bioethicists and a keen-eyed chronicler of medical trends. “It's as if the medical outcome was almost secondary.” Healthcare thus became a misguided alliance between physician and patient in which the latter came to rely on the former for a psychosomatic cure-all: The visit was itself the treatment, allowing the patient to believe that something was being done, whether or not such was legitimately the case. “Most people who go to the doctor are going to get better anyway,” explains Dr. Sally Satel, co-author of One Nation Under Therapy: How the Helping Culture is Eroding Self-Reliance. “That's because most illnesses are self-limiting. They won't kill you, and sooner or later they just go away. But that's not how the average person looks at it. People feel they're supposed to go to the doctor when they're sick, and then when they get better, they credit the doctor and his treatment.”
In the 1950s, Americans dragged themselves and their families to the doctor in droves. During the so-called “Golden Age of Antibiotics,” between the post-war period and the early 1980s, human life expectancy jumped by an ostensible eight years, an increase then attributed to the wide availability of antibiotic compounds. (This too would prove fallacious.) So encouraging was the trend-line that in 1969, U.S. Surgeon General William Stewart announced the end of pathogen-borne illness: In a brash and boastful speech to Congress, he told legislators it was “time to close the books on infectious diseases.” For sheer naivete, the remark rivaled the line attributed to Charles Duell, commissioner of the U.S. Patent Office at the dawn of the 20th Century. Duell is said to have lamented that there was little future in his line of work, since “everything that can be invented has already been invented.”*
We now know better. We know because the world is living with the fallout from indiscriminate use of antibiotics and the haughtiness that occasioned it. And we're feeling the effects of that fallout not just in terms of pharmaceuticals, but in every single area of medical practice.
To be continued...
* In the interest of journalistic accuracy, it should be noted that whether Duell actually made the remark in exactly those words remains in dispute.