Do rising research costs cause drugmakers to ignore dangerous diseases?
In 1952, George W. Merck, then-president of what is now global pharmaceutical giant Merck & Co., was featured on the cover of Time Magazine with a tagline proclaiming, “Medicine is for the people, not the profits.” Today, the phrase has been appropriated by journalists and critics to taunt the corporation when it runs into controversy, but the quote remains relevant because it succinctly captures the dueling conceptions that define the pharmaceutical industry. Drug manufacturers are profit-maximizing firms in a market characterized by public goods, somehow expected to maintain the price-setting competition that breeds innovation while still filling public health needs regardless of profit potential. In the full quote, Merck goes on to say, “The profits follow, and if we have remembered that, they have never failed to appear.” This may well have been true during the post-WWII pharmaceutical golden age of antibiotics and vaccines that Merck led his company through. But in today’s industry, where inventing a medicine means pushing a billion-dollar investment down a decade-long track with a one-in-10 chance of survival, this sentiment would probably not hold up in a boardroom. If profits follow, why did Congress pass the Orphan Drug Act to incentivize drugs for rare diseases? And why did drug companies start shunning antibiotic development even as health administrators gave dire warnings of a drug-resistant apocalypse? Granted, Merck was speaking from a public relations standpoint, but by many accounts, the drug industry of the time did live up to this attitude. Drug discovery was systematic and relatively easy, comparatively basic science meant fewer barriers to entry, and the huge demand for antibiotics made profit incentives and public welfare interchangeable.
Now, faced with the looming threat that those same antibiotics might one day stop working and end modern medicine as we have come to know it, the question is what structural change in the pharmaceutical industry caused profit incentive to skew away from public benefit and made Merck’s idealism seem laughably naïve.
In September 2013, the U.S. Centers for Disease Control and Prevention published its first-ever comprehensive assessment of the threat posed by known antibiotic-resistant ‘superbugs.’ At a press event held for the release, CDC Director Tom Frieden underscored the report’s urgency with an ominous warning: “If we are not careful, we will soon be in a post-antibiotic era, and, in fact, for some patients and for some microbes, we are already there” (CDC 2013). ‘Superbugs,’ the mutated breeds of drug-proof organisms that rise from the ashes when antibiotics do not fully destroy their target, now cause about 2 million illnesses and 23,000 deaths in the United States every year, according to the report. The problem Frieden is referring to should not come as a total surprise; scientists and doctors have been warning the public of this eventuality since antibiotics were first introduced. Upon receiving the Nobel Prize in 1945 for the discovery of penicillin, Alexander Fleming remarked in his acceptance speech that without proper vigilance over the drug’s administration, his contribution to modern medicine could be fleeting. Patients not following through with dosages, doctors misprescribing or overprescribing, and the widespread use of antibiotics on livestock have all quickened the almost inevitable evolution of resistant strains. With incidence rates and death counts of superbug illnesses on the rise, medical experts are increasingly contemplating the grim prospect of a world without antibiotics. Health conditions in such a world would likely resemble those before the advent of antibiotics, when tuberculosis, pneumonia, and gastrointestinal infections were the three leading causes of death in the U.S., five in 1,000 women died in childbirth, and simple surgical procedures were often a matter of life or death (CDC). This outcome is a worst-case scenario, but if new drugs cannot keep up with the spread of superbugs, it could become a reality.
To make matters worse, the once-steady pipeline of new antibiotics has all but dried up in the last 15 years. Only eight new antibiotics have been approved since 1999 and just one since 2009. In comparison, the 15-year period leading up to 1999 brought 33 new antibiotics to the market, 11 between 1995 and 1999 alone (FDA data). It seems the pharmaceutical industry now lacks either the incentive or the capability to continue creating these drugs. Many pharmaceutical giants “have greatly curtailed, wholly eliminated, or spun off their antibacterial research” in the last decade, reversing an early 1990s trend towards ramping up research in the area to combat emerging antibacterial resistance (Projan 2003). There are three commonly cited explanations for this downturn. One is scientific difficulty; drugmakers claim antibiotic research is particularly complex now that the “low-hanging fruit” drugs have already been developed (Kraus 2008). Another is lack of payoff; the chances of an antibiotic turning a profit are slim because of its short treatment duration and the risk that a superbug will render it obsolete (Projan 2003). And third, FDA-mandated clinical trials are enormously difficult to arrange for antibiotics, especially for a medication targeting antibacterial resistance, since eligible patients will most likely have been treated with interfering antibiotics before their resistance was recorded. Until very recently, the FDA gave antibiotics very little regulatory leeway, seeing tighter standards as a way of containing their widespread overuse (Nathan 2004). All three factors add extra costs or risks to the research and development process and diminish whatever initial economic incentive there may have been to produce antibiotics.
This dilemma is not entirely confined to antibacterial development. Rather, it is characteristic of trends in the pharmaceutical industry as a whole, an industry many consider to be in the throes of an ongoing crisis. As R&D costs have grown exponentially, the annual output of new molecular entities (NMEs), or drugs with newly discovered active ingredients, has stalled over the last 15 years (Munos 2009). Meanwhile, generic bioequivalent medications now take up the bulk of the market, while branded companies focus on tweaking existing drugs to drag out their patents and on pumping up sales of longstanding blockbusters. Evidence of this practice can be seen in the industry’s “Freshness Scale,” a measure created by pharmaceutical industry expert Bernard Munos showing that 52 percent of total pharmaceutical sales in 2012 came from drugs approved before 2001 and only 10 percent from those approved since 2007 (Munos 2013). Additionally, between 1996 and 2006, the portion of industry revenues generated by blockbuster drugs increased from 12 percent to about 50 percent, suggesting a drop in the variety of revenue-generating drugs (CBO 2009). The heavy reliance on older drugs and evergreened patents indicates a sluggishness of real growth in the industry since the late 1990s, when the number of NMEs approved peaked before sloping off into its current stagnation.
Industry experts have speculated endlessly on what types of barriers are holding back pharmaceutical innovation. Here again, the “low-hanging fruit” theory is probably the simplest way to explain away the lag, though not necessarily the most valid (Cockburn and Henderson 2001). Another factor is the growing regulatory burden and the enormous rate of increase in R&D costs, which disincentivize the level of risk-taking required for innovation (Munos 2009). Munos (2009) estimates that the cost of clinical testing doubles approximately every 8.5 years and that the average cost of R&D for an NME is approximately $1.3 billion. Studies also examine how changes in the industry’s structure and corporate culture may inhibit innovation. Comanor and Scherer (2013) show how consolidation through mergers and acquisitions counteracts the efficiency of competing research paths, and Munos (2009) cites the growing size of corporations as a damper on productive research. Munos (2009) also points to executives’ corporate finance mindset, hierarchical organization, and lack of scientific or creative awareness as key structural flaws: “The most damaging legacy of the process-minded CEOs who brought us the innovation crisis has been to purge disrupters from the ranks of pharma.” The common thread in these explanations seems to be that pharmaceutical R&D departments lack some essential measure of creative freedom and risk-taking ability, whether because of pressure to beat enormous costs or stuffy corporate bureaucracy. Because of this, companies’ efforts to revamp innovation have been misguided.
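Munos’s doubling figure implies simple exponential growth in trial costs. A minimal sketch of that arithmetic in Python (the function name and the baseline index are illustrative, not from the cited study):

```python
# Illustrative: if clinical-testing costs double roughly every 8.5 years
# (Munos 2009), a cost index normalized to 1.0 at a baseline year grows
# exponentially with elapsed time.
def cost_index(years_elapsed, doubling_period=8.5):
    """Relative clinical-testing cost after `years_elapsed` years."""
    return 2 ** (years_elapsed / doubling_period)

# After two doubling periods (17 years), costs are ~4x the baseline.
print(cost_index(17))   # -> 4.0
print(cost_index(8.5))  # -> 2.0
```

Under this assumption, a trial program that cost $100 million at the baseline would cost roughly $400 million 17 years later, which helps explain why per-NME costs dwarf those of the 1990s.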
Still, some say that to refer to the situation as an “innovation crisis” overstates the issue. After all, for an industry that relies on yields from unpredictable scientific research and a revenue model so volatile that only about one third of approved drugs actually pay back their cost (Grabowski et al. 2002), is it reasonable to expect pharmaceutical companies to maintain a level of productivity consistent with that of the industry’s most successful years? It is also true that NME output today, though stagnant, is not much lower than it has been for many stretches of the past 50 years (Schmid and Smith 2005). What makes the situation a crisis, however, is not the rate of new drug approvals alone, but the staggering decrease in output per research dollar that has occurred simultaneously. Between 2000 and 2011, the aggregate R&D expenditures of companies belonging to PhRMA, the pharmaceutical manufacturer trade group, increased by $22 billion (PhRMA 2011). Yet 27 NMEs were approved in 2000 and just 30 in 2011. The average cost of developing an NME, from the discovery of the new molecule to final FDA approval, is estimated at approximately $1.1 billion over a timeframe of 10 to 12 years (PhRMA, CBO). Similarly, the likelihood of a molecule making it through all three phases of clinical trials to reach the market seems to have dropped from DiMasi’s 2003 estimate of 21.5 percent to about 11.5 percent (Munos 2013). Clearly, the increases in funding poured into research are not producing a proportional increase in scientific discovery. So while the lack of growth in NMEs might look like more of a natural lull than a nascent crisis in numbers alone, the real impact lies in the skyrocketing cost of each NME and its implications for pharmaceutical innovation.
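The link between clinical attrition and cost per approval can be made concrete. A back-of-envelope sketch, assuming a fixed cost per candidate and treating the phase-success rates quoted in this section as the only change (the helper name is mine, not from the cited studies):

```python
# If each drug candidate costs the same to push through trials, the expected
# R&D cost per APPROVED drug is the per-candidate cost divided by the
# probability that a candidate survives all clinical phases.
def cost_per_approval(cost_per_candidate, p_success):
    """Expected R&D spend per approved drug, amortizing failed candidates."""
    return cost_per_candidate / p_success

# Success rates quoted above: 21.5% (DiMasi 2003) vs. ~11.5% (Munos 2013).
baseline = cost_per_approval(1.0, 0.215)  # cost in arbitrary units
today = cost_per_approval(1.0, 0.115)
print(round(today / baseline, 2))  # -> 1.87
```

In other words, the drop in clinical success rates alone, holding everything else constant, nearly doubles the expected cost of each approval, before any growth in per-trial costs is even counted.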
As the cost of research per NME rises, it presumably forces drug companies to re-evaluate what types of drugs are worth investing in, since each product must now carry a much higher profit margin to pay off the initial investment. While a high degree of uncertainty about future revenue streams is inherent to the R&D process, firms do have a general idea of which characteristics of a drug tend to yield steadier profits. In general, medications for chronic diseases, “where patients can look forward to months, even years, of treatment,” and for diseases affecting a broad spectrum of the population tend to make safer investments (Projan 2003). Medications for acute, or short-term, conditions, antibiotics chief among them, would be the first to fall by the wayside if companies decided to cut down on risk in their R&D portfolios (Nathan 2004). This potential for reallocation illustrates why rising research costs pose such a grave threat to innovation. Conceivably, the greater the cost of bringing a drug to market becomes, the more a company will shift its R&D funding towards chronic diseases that affect a larger portion of the population and away from acute conditions and rarer diseases. Since these considerations take no account of public need for a drug beyond what is necessary to estimate profits, this shift would likely leave substantial gaps where severe problems could be solved by new drugs but too little profit potential exists to prompt their development, a key example being the desperate need for antibiotics. More broadly, the argument could be made that if firms consider only the net present value (NPV) of an investment in their decisions, the growing cost of research leads directly to increasing misalignment between social welfare and profit maximization, since it narrows the range of drugs available and ignores the potential for innovation in certain therapeutic areas.
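The reallocation logic above can be sketched as a toy NPV comparison. All revenue figures and the discount rate below are hypothetical, chosen only to illustrate why a chronic-disease drug can clear a roughly $1.1 billion R&D cost while an otherwise identical acute-care antibiotic cannot:

```python
# Toy NPV comparison: identical upfront R&D cost, different revenue streams.
# Figures are hypothetical; the point is the shape of the decision, not the
# numbers themselves.
def npv(cash_flows, rate=0.10, upfront_cost=1.1e9):
    """Net present value of annual cash flows against an upfront R&D cost."""
    return sum(cf / (1 + rate) ** t
               for t, cf in enumerate(cash_flows, start=1)) - upfront_cost

chronic = npv([300e6] * 12)     # daily-use drug: steady sales over patent life
antibiotic = npv([120e6] * 12)  # short courses, use held in reserve

print(chronic > 0)     # -> True: the chronic-disease drug pays back
print(antibiotic > 0)  # -> False: the antibiotic never recoups its cost
```

On these assumptions a profit-maximizing firm funds the chronic-disease drug and drops the antibiotic, even though the antibiotic may address the more urgent public health need, which is precisely the misalignment the paragraph describes.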
The antibiotics problem seems to be finding traction in the medical community, and in the last couple of years, a few reforms have been put into place that could have important effects. In 2012, Congress passed the Generating Antibiotic Incentives Now (GAIN) Act, which grants qualifying antibacterials an additional five years of market exclusivity and expedites their review. The FDA also recently decided to relax its stringent requirements for antibiotics and put into place its own incentive program. The success of these measures remains to be seen. The antibiotics problem is rooted in the structure of the industry, and overcoming it may take a broader overhaul of the market forces governing these drugs.
Cockburn, Iain and Rebecca Henderson (2001). Scale and scope in drug development: unpacking the advantages of size in pharmaceutical research. Journal of Health Economics 20, 1033-1057.
Comanor, W.S. and F.M. Scherer (2013). Mergers and innovation in the pharmaceutical industry. Journal of Health Economics 32, 106-113.
Congressional Budget Office (2009). Pharmaceutical R&D and the Evolving Market for Prescription Drugs. Washington, D.C.
DiMasi, J.A. and H.G. Grabowski (2007). The cost of biopharmaceutical R&D. Managerial and Decision Economics 28, 469-479.
Schmid, Esther F. and Dennis A. Smith (2005). Keynote review: Is declining innovation in the pharmaceutical industry a myth? Drug Discovery Today 10(15), 1031-1039.
Munos, Bernard (2009). Lessons from 60 years of pharmaceutical innovation. Nature Reviews Drug Discovery 8, 959-968.
Nathan, Carl (2004). Antibiotics at a Crossroads. Nature 431, 899-902.
PhRMA. Pharmaceutical Industry Profile 2012. www.phrma.org/files.
Projan, Steven (2003). Why is big pharma getting out of antibacterial drug discovery? Current Opinion in Microbiology 6(5), 427-430.