Sharing the benefits of biomedical science

This essay was written by Michael Sargent and was first published in the 2005 Mill Hill Essays.

“A man who dies rich, dies disgraced”. This arresting sentiment guided Andrew Carnegie, the American steel plutocrat, to disburse his entire fortune in philanthropic work. It also greatly influenced the man who would become the greatest patron of biomedical science, John D Rockefeller. This arch-conservative son of a dealer in fake cancer cures, whose fabled wealth originated in notorious chicanery in the oil industry, was one of the unlikeliest of philanthropists. But, with Carnegie’s words firmly in mind, he used his loot “to promote the wellbeing of mankind throughout the world” and incidentally set biomedical science on its meteoric career. In a revolutionary outburst of philanthropy that culminated in 1913 in the establishment of the Rockefeller Foundation, he created vehicles dedicated to addressing what he believed was the world’s greatest social issue: poor health. The new Foundation had an all-embracing range: public health infrastructure, fundamental biomedical research, tropical diseases, population policy and the genetic improvement of staple crops. Huge grants were made to improve medical education and public health provision throughout the world. Particular largesse was focused on London as a strategic centre for promoting health amongst the impoverished millions of the British Empire. The Foundation set a salutary example to later generations of philanthropists and to governments, contributing to many notable biomedical achievements, including 137 Nobel Prizes.

After the Second World War, a golden age of pharmaceutical chemistry dawned in which an American-dominated industry began to make products that met the historic needs Rockefeller and his advisors so keenly recognised. Impressively effective medicines, including antibiotics, anti-inflammatory steroids, contraceptive pills, tranquillisers and anti-depressants, agents of chemotherapy, immunosuppressives, heart drugs and vaccines, revolutionised every aspect of health care. No enterprise could have earned more kudos from the public. Pharmaceutical products were perceived as amongst the most outstanding achievements of the century, and leaders of the industry enjoyed representing their activities as philanthropic missions. George Merck II used to boast, “we never forget that medicine is for the people, not for profits. The profits follow, if we remember that”. Today, such remarks provoke ironic laughter, but in the 1950s pricing was never an important issue. The real cost of bringing an antibiotic to market was perhaps only one thousandth of present-day costs and was kept low by intense competition. National Health Services in much of the developed world could purchase new drugs as they became available, for the benefit of all their citizens.

The pharmaceutical industry soon became the most profitable business sector in the American economy, with its peculiar trademark language of ‘blockbusters’ and ‘drug pipelines’ and the aggressive salesmanship that contributes so much to its business strategy. Today, the industry still claims its prices fairly reflect the costs of research and the inherent risks, but some financial analysts are less certain. They believe that more than half the cost of developing a drug in America is attributable to fierce promotional activity, in which there is roughly one sales representative for every five doctors. Profitability is also greatly enhanced by the patent system, which gives the owner monopolistic control of prices for a period in which the costs of discovery, and more, can be recouped. Profitability is further sustained by company mergers that tend to reduce competition. In addition to these legitimate aids to profitability, illegal price-fixing cartels are commonplace for drugs not protected by patents.

The dynamism of pharmaceutical innovation in America was initially driven by university-based biomedical research, funded by the National Institutes of Health (NIH) since the 1940s. Some critics felt the financial interest the NIH retained in the outcome of research was a disincentive for the development of new products by industry. By the 1970s, these critics were calling for patent and licence fees to be transferred entirely to the organisations that make discoveries, to entice them into commercialising their discoveries more effectively. With the Bayh-Dole Act of 1980, this idea became law. The ensuing licensing frenzy was greatly stimulated by the birth of the gene technology industry and the multitude of start-up companies and stock market flotations. The American Orphan Drug Act of 1983 provided another stimulus for industry, this time to encourage the development of treatments for rare ‘orphan’ diseases for which there was no existing commercial incentive. It gave companies a seven-year period of exclusivity if they sponsored drugs through the Food and Drug Administration (FDA) approval process. Most protein medicines made by gene technology qualified for this kind of support. Apparently to everyone’s surprise, the first profitable outcomes were from drugs such as erythropoietin (a protein factor that stimulates red blood cell formation) and growth hormone. Undoubtedly their commercial success was in part underpinned by clandestine sales for use in enhancing athletic performance. Development of protein medicines, including monoclonal antibodies, is a longer and more expensive process than developing antibiotics, with greater risks of late-stage failure. Nonetheless, effective medicines of this type have emerged that promise to change the prospects of patients who suffer from a range of chronic and often fatal diseases including leukaemia, lymphoma, breast cancer and multiple sclerosis.
Herceptin, the breast cancer drug developed by Genentech in America, is a notable example of the problems new therapies may provoke. The clinical news about this drug appears extraordinarily promising for patients with an appropriate genetic profile and is already stimulating enormous interest despite a cost of perhaps £20,000 for a course. National health services obviously require unequivocal independent evidence of efficacy, and some idea of how such treatments will be financed, before making the drug available on demand. There is little doubt a similar furore will precede the introduction of every new treatment for serious chronic diseases for the foreseeable future.

Opinion is sharply divided about the Bayh-Dole legislation. Some hail it as a hugely liberating measure that has massively stimulated biomedical creativity. Others see it as a vehicle for profiteering from drugs that are perceived as the rightful property of the American public, developed in tax-payer-financed laboratories. One Congressman claims that the cancer drug Taxol has made nine billion dollars for its manufacturer, Bristol-Myers Squibb (BMS), under its brand name, while the National Cancer Institute (NCI) received “only a paltry 35 million dollars”. The development costs of Taxol for BMS and NCI were, respectively, one billion and half a billion dollars. The cunning of the industry was widely discussed in the context of therapies for AIDS (acquired immune deficiency syndrome). The drug company Burroughs Wellcome astutely applied for a patent to use an old drug (AZT, first made by the NCI in 1964) for the new disease within a year of the first reports of the syndrome (1980). Three years later, the drug was on sale as Zidovudine, at $8,000 for a year-long course of treatment. By the mid-1990s the triple therapy – an anti-HIV therapy based on three drugs that could reduce mortality – was available in the developed world. However, its enormous cost and the medical care required for monitoring patients put this solution quite beyond the resources of the countries of sub-Saharan Africa during the period when the epidemic was spreading rapidly in that region. The Mumbai-based company Cipla claims it can supply the triple therapy for $304 per annum. Even so, this price is too great for 95 percent of the infected population of the region. Hope for Africans (and presumably for Cipla) is now invested in philanthropists and international organisations who may pay for the therapies.

For the last three decades, a relentless pursuit of profit has meant that major pharmaceutical companies have almost eliminated research programs directed at diseases of the developing world. By the 1990s, international health officials were referring to the 90:10 gap to illustrate how ninety percent of the world’s biomedical research effort benefits just ten percent of its people. Médecins sans Frontières, the leading non-governmental medical aid organisation, reported a standstill in the search for drugs effective against thirteen neglected diseases rife in the developing world, and warned that these diseases were exacerbating poverty. Coincidentally, just thirteen new drugs were developed for tropical diseases between 1975 and 1999. Stung by these findings, editors of biomedical publications frequently urge the science community to do better, and governments to offer financial incentives to lure companies into research on neglected diseases. The tax breaks that make rare ‘orphan’ diseases such worthwhile targets for research in America are an inadequate inducement for research into neglected tropical diseases, because those who suffer from them are usually unable to pay for a novel therapy. Demand for new kinds of antibiotics remains high, but major pharmaceutical companies such as Roche and Eli Lilly are abandoning this kind of research because they perceive it as less profitable than therapies for chronic diseases such as cancer. There are indications that smaller companies may take up the challenge (perhaps in Singapore), but at a time when drug-resistant microbes are a growing threat, it seems short-sighted if not reckless to abandon the search for new antibiotics.

A recent analysis of the problem of neglected diseases by Mary Moran of the London School of Economics suggests a promising new paradigm is emerging. At least 63 new drugs (not including vaccines) for neglected diseases are in development in so-called ‘Public-Private Partnerships’ (PPPs). These are collaborative enterprises based in academic centres, run on a not-for-profit basis and financed by private philanthropy and industry. Moran identifies four big multinational companies that contribute on a non-commercial basis. She suggests three motivations: the need for a plausible socially responsible contribution to the developing world, the need for a strategic foothold in these markets, and the need for a pool of low-cost researchers. PPPs facilitate the participation of major companies in the creative stage of drug development without committing shareholders to reduced revenues. The next stage, expensive clinical trials, will probably be undertaken at an appropriate time with the aid of other PPPs. Evidently small companies find commercial opportunities in neglected diseases that multinationals ignore. The oral antileishmaniasis drug miltefosine, for example, was developed and registered by the small German company Zentaris, with the help of academics and the WHO (World Health Organisation). PPPs co-ordinate and integrate the early stages of development of compounds from many different sources that no one company would pursue, at relatively low cost. Moran lists examples of multinational collaborative ventures made under the umbrella of the ‘TB Alliance’ and the ‘Medicines for Malaria Venture’, involving universities in the developed and developing world. The LSE study suggests that the optimal process involves both industrial and academic expertise. The report identifies three successful therapies developed using the PPP approach against tropical parasite diseases: ivermectin, praziquantel and Coartem.
Ivermectin halved the burden of river blindness (onchocerciasis) between 1990 and 2000; praziquantel is used to control schistosomiasis; Coartem is a safe, effective antimalarial for children, based on the Chinese anti-malarial artemisinin. In comparison, the thirteen clinically proven medicines listed by Médecins sans Frontières are rarely useful in developing countries because of their high price or poor suitability for local conditions. Another promising sign is that Brazil, Egypt and India, countries beset by ‘neglected diseases’, now have the infrastructure to conduct their own research and to develop their own therapies, including vaccines. A bigger group of countries is an important source of cheap copies of patented drugs, for which they risk legal retribution because they infringe World Trade Organisation rules. Colombia and Guatemala, for example, are guilty of the heinous crime of manufacturing the antibiotic Ciprofloxacin for five cents a tablet for their own use, compared with $3.40 in the United States. Several international organisations are adamant that ‘intellectual property rights’ should play second fiddle to humanitarian needs, but accept that such arrangements should not jeopardise international markets, nor should the quality of the products be second-rate.

The viability of the PPPs is a pressing question. The answer is far from clear, but PPPs have successfully attracted billions of dollars from the Gates and Rockefeller Foundations. They are also contemplating manufacturing the drugs they pioneered on an industrial scale and are likely to insist on setting profits at no more than ten percent of the price, rather than the high margins preferred by multinationals.

Clinical therapies that solve acute health problems tend to overshadow preventative medicine. However, much of the improvement in life expectancy in all countries in the last century is attributable to public health management and local self-help activities such as improvements in mothercraft. Ambitious efforts to share the benefits of biomedical science with the developing world began after the Second World War, organised by international institutions such as the WHO to control malaria, smallpox and other infectious diseases. At the same time, steps were taken to guide countries towards better population and food policies.

The most radical initiative ever contemplated in international public health — the eradication of smallpox throughout the world using a vigorous vaccination policy — was proposed by the WHO in 1953. It seemed impossibly utopian to some, but within five years China and the Soviet Union had eliminated the disease from their territories using their own resources. Almost shamed by this example, the WHO felt compelled to eradicate the virus from the rest of the planet, a policy that evidently succeeded: the very last case occurred in Somalia in 1977. The WHO now hopes vaccination programs will be used to eradicate other infectious diseases and, indeed, eradication of polio is almost complete. An American-financed program of comparable ambition, led by the WHO, was mounted at the same time to control malaria by eradicating mosquitoes using DDT. The incidence of malaria was spectacularly reduced in India and Sri Lanka, and many islands — Japan, Taiwan, Zanzibar, Cyprus, Singapore and Jamaica — were permanently freed of malaria. However, the campaign was abandoned as fears for ecosystems and unacceptable health risks began to weigh heavily on the organisation. Sadly, although great victories were won, they were incomplete: mosquitoes reappeared within a few years, ominously resistant to DDT, though still free of malaria. Meanwhile a big challenge was developing from the advance of malaria resistant to chloroquine, the main anti-malarial drug at that time. The Roll Back Malaria campaign, launched in 1998 with support from many sources, made a pledge to halve deaths from malaria by 2010. The scheme was predicated on three strategies: insecticide-treated bed nets, prompt and effective treatment, and preventative treatment during pregnancy. Each element of the campaign seems to be effective, but the logistics of reaching all those at risk make success seem virtually unattainable.

Major international programs for disease surveillance have been in place for many years, to provide collective security against deadly epidemics that usually fall with disproportionate harshness on the developing world. Influenza, an airborne disease with the capacity to spread with devastating ferocity through unprotected populations, is the focus of one important program. International infectious disease surveillance systems were put to an unexpected and important test when SARS (Severe Acute Respiratory Syndrome) emerged in the Far East in late 2002. The outbreak was alarming and tragic, but the system performed impressively, setting new standards for dealing with a novel disease with practical scientific rigour.

Since 1952, the Ford Foundation and other organisations have financed the Population Council (set up with Rockefeller support), whose large grants help the governments of developing countries to reduce population growth through family planning. For a time, the American government was active in the field, but since the Reagan presidency this activity has declined and the Foundations have again taken the lead. India and China have always given population policy a high priority and have used their own resources very effectively.

Since the 1950s and continuing to the present day, the WHO and other international bodies have facilitated the introduction of medicines of proven efficacy to the developing world. The WHO maintains a list of ‘essential drugs’ — now numbering more than 300 — that are necessary wherever ‘western’ medicine is practised. They include antibiotics, painkillers, anti-inflammatories, anaesthetics, insulin, heart drugs and many others. They are usually cheap because they are mostly not protected by patents. Médecins sans Frontières claims that about one third of the world’s population lacks access to them because of the precarious governance of certain countries, whose governments permit inappropriately high import taxes, lack skill in purchasing, tolerate inappropriate prescribing and lack adequate quality control, to the extent that fake substitutes are sometimes in use.

While medicines are vital for managing ongoing disease, the history of every developed country since the nineteenth century tells us that high life expectancy and good health were achieved by a host of social interventions. Developing countries must recapitulate this evolutionary process if they are to achieve the same situation. In recognition of this, world leaders marked the millennium in 2000 with a resonant pledge to free the world from the “abject and dehumanising conditions of extreme poverty” by 2015, through eight ‘Millennium Development Goals’. Nobody could disagree in principle with a process that reduced poverty, enrolled every child in primary school, reduced infant and maternal mortality and eliminated infectious disease. By May 2005 it was clear that these goals were being missed by a wide margin in Africa. Indeed, in ten African countries infant mortality — one of the key statistics — is now worse than in 2000. Well-informed organs such as The Economist regard with suspicion a scheme that sets precise quantitative targets without a careful calculation of what might be feasible for individual countries. They are sceptical, too, that any of the statistics can be determined with any accuracy. Nobody can be optimistic about the ‘Millennium Goals’ when the signatories, with remarkable hypocrisy, are unable to give the whole-hearted financial support the scheme needs.

We have come a long way in the developed world since John D Rockefeller committed his fortune to improving the wellbeing of humankind. Within the developed world we have learnt to share the benefits of biomedical science to the extent that our health needs are generally paid for jointly, if not entirely equitably, through contributions to National Health Services or insurance schemes. As financial instruments, these generate incentives for biomedical innovation that Rockefeller the capitalist never seemed to anticipate. However, for most of the developing world self-priming arrangements of such sophistication are not realistic options; many health challenges of that world remain a moral obligation for the developed world, one that has not changed since the time of Rockefeller. We seem to have found a way in which the interested parties do what they are best at. Pharmaceutical giants make drugs and profits on a big scale. Philanthropists, international aid organisations and governments find ways to overcome the financial barriers that impede distribution of drugs and discovery of new therapies. The not-for-profit company is emerging as a new business model that might focus pharmaceutical creativity on the common neglected diseases of the developing world and the rare congenital disorders of the industrial world.
