Citizenship and the medical sciences

This essay was written by Michael Sargent and was first published in the 2003 Mill Hill Essays.

Every civilisation that called its people “citizens” invested them with certain rights to participate in self-rule by making choices, in return for specific civic obligations, most notably requirements to pay taxes or to bear arms. In the early twentieth century, the public began willingly to accept new kinds of obligation, originating in science that promised improvements in the quality of life. Shaping these developments were important new insights into the causes of infectious disease, the workings of vaccination and nutrition, and the principles governing how blood and other tissues could be donated from one individual to another. Later still, a revolutionary discovery in reproductive physiology made planned pregnancy possible.

In Britain in 1900, one in seven babies died in infancy, one in two hundred mothers died in childbirth and life expectancy for men and women was under fifty years. Rampant infectious disease – tuberculosis, diphtheria, whooping cough and scarlet fever – prematurely terminated many lives. Popular tradition often attributes the defeat of infectious disease to antibiotics and immunisation deployed in the 1950s, but in reality the prevalence of infectious disease was already on a steep downward path in 1900. As the germ theory of infection gained acceptance amongst scientists, the public quickly became enthusiastic about hygiene. “Germs” entered the English language as the embodiment of a public enemy that needed to be eradicated, while the advertising industry in the first flush of youth encouraged hand washing, the use of soap, disinfectants and handkerchiefs. Meanwhile a host of official interventions, largely inspired by the germ theory, began to reduce the threat of infectious disease.

The great mid-Victorian innovations of closed sewers, piped microbe-free water and Medical Officers of Health (MoH) predated the germ theory. These had already dramatically reduced the risk of water-borne epidemics and put sanitary regulation at the heart of public administration. However, the role of the MoH was crucially shaped by the germ theory when, in 1889, Parliament invested these officers with authority to control the spread of infectious diseases by the only weapons available: notification and isolation. Householders or persons attending the sick were required to notify the MoH of every instance of communicable disease from typhus to measles, and the MoH was empowered to isolate infected individuals in fever hospitals. The recognition of tuberculosis as a bacterial disease transmitted in airborne water droplets focussed attention on how infection might spread, from overcrowded homes to the time-honoured custom of spitting. Tuberculosis patients were increasingly isolated from the community, in sanatoria and other establishments, minimising their capacity to infect their families and the public. Nothing, except perhaps better nutrition, could arrest clinical infections once started, but between 1840 and the introduction of antibiotic therapy in 1950, deaths from tuberculosis fell by ninety percent because of reduced transmission of the disease.

Food poisoning also blighted Victorian Britain. Unhygienically collected milk carried many kinds of microbe, including Mycobacterium bovis, the probable cause of intestinal tuberculosis and many child deaths. Diarrhoeal diseases were spread by flies that thrived in Victorian cities on rubbish and horse droppings; both the flies and the diseases they spread were greatly reduced once municipal refuse collections and dustbins were introduced. Those who prepared food for the public were compelled by legislation to recognise their responsibilities and ensure food was free of infection. Health Visitors – a new profession created by legislation in 1905 – contributed hugely to the improved survival and health of children by introducing socially disadvantaged families to the skills of hygienic mother-craft. These salutary achievements notwithstanding, the work of the pioneers of public health remains unfinished. In the last four decades more than ten new infectious diseases have been discovered, each pointing to another chink in our hygienic armour. Reported cases of food poisoning and sexually transmitted disease are rising inexorably. The emergence of AIDS – an incurable and potentially fatal disease – has made unsafe sexual practices and intravenous drug use an extremely dangerous affront to responsible citizenship. While health officials can warn and usually cure, zealous regard for the rules of hygiene remains the best protection against avoidable infection.

Jenner’s famous demonstration in 1796 of the power of vaccination to prevent smallpox inspired several European states to institute compulsory vaccination against this most dangerous and infectious of diseases. Within a decade, the value of vaccination for protecting an entire population was established. In Britain, vaccination was free on demand but remained voluntary: a life-saving gift to the affluent, ignored by the poor. Eventually, in 1853, prompted by unnecessary deaths from the disease, Parliament made vaccination of infants compulsory. Working-class apathy promptly changed to defiant hostility in some quarters and persisted until compulsion was repealed in 1909, although the policy was sufficiently effective to make smallpox uncommon in Britain. After the 1914-18 war, the idea of eradicating other serious childhood diseases by mass immunisation gripped the imagination of public health authorities. Diphtheria, a disease that terrorised poor neighbourhoods everywhere, was the first target of this enthusiasm, and New York’s Health Department was the first organisation to apply scientific medicine in a public health crisis. Using a novel publicity drive, it persuaded the public of the safety and efficacy of the vaccine, and within a few years the death rate from the disease had been halved. By the 1950s, as the success of this and other vaccines became apparent, British parents generally allowed their children to be immunised against childhood diseases. The small risk of adverse effects associated with vaccination was accepted in view of the bigger risk associated with the diseases and the greater good of protection for the community. However, remembering the difficulties with smallpox, British governments have refrained from making vaccination compulsory, as some other countries have done.

The third cornerstone of improved health in the early twentieth century was a rational understanding of nutrition. Research indicated that a satisfactory diet must have a minimum calorie content and a balanced mix of protein, carbohydrate and fat, plus vitamins and minerals. The political establishment of 1930s Britain, when food availability was better than in any previous era, was surprised by the results of the first serious nutritional surveys. About one in five British children and one in ten adults were chronically undernourished, and about half the population was deficient in at least one specific nutrient. Through the efforts of health authorities in the 1940s and 50s, greatly helped by steadily rising incomes and the fortification of staple foods with vitamins, the standard of nutrition improved. Today, few people are ignorant of the dietary requirements for avoiding malnutrition, but doctors still occasionally encounter vitamin deficiency amongst their patients.

A recognisably modern civilian blood transfusion system started in London in 1921, when public-spirited, unpaid volunteers donated blood for the first time and made modern trauma care and surgery possible. In the 1970s, the idea of tissue donation was extended to bone marrow and blood platelets (the curious cell fragments in blood that initiate clotting), facilitating more innovative life-saving procedures.

The relentless reproduction of people barely able to support their families brought another dimension to the misery caused by infection and malnutrition in Victorian society. Contemporary records indicate that infanticide and death by neglect in “baby farms” were the fate of countless infants. Former “Hospitals for Foundlings” still exist in many European cities, poignant witnesses to the scale of unwanted pregnancy. The British birth rate peaked in 1870 and then fell, marking the start of an era in which the effort to control fertility would gradually gather momentum, culminating in the 1950s in the discovery of safe, effective hormonal contraception. At a time when the world’s population was doubling every forty years, this signalled unequivocally that unwanted pregnancy was avoidable. In most of the developed world, the birth rate began its historic decline to little more than replacement level. Yet the place of planned parenthood in responsible citizenship in Britain today is still far from assured: for every three live births there is one abortion.

The health-science profession has guided the public towards accepting responsibility for hygiene, immunisation, nutrition and reproduction with relatively little controversy for a large part of the twentieth century. As threats to human health have receded and new objectives for the medical sciences have surfaced, this relationship has changed. Highly innovative ideas, such as the contraceptive pill and in-vitro fertilisation, were adopted democratically once their safety and utility were established, to the chagrin of those traditionalists who claimed to be the conscience of society. Different voices now claim society is being bamboozled into accepting genetic and reproductive innovations that ought to be unwelcome. While the consequences of these ideas for human biology are still uncertain and the challenges to the ethical status quo incompletely resolved, anxiety is clearly justifiable. The media invites discussion as never before, the public clearly wants more information and, indeed, wants to participate in self-rule by making choices in these matters.

Perceptions of recent biomedical landmarks are often associated with the names of individuals: Louise Brown, the first “test-tube baby”; Adam Nash, the first “spare parts baby”; Kim Cotton, the first surrogate mother, to name just a few. The press – viewed charitably – excites discussion with these cheap canards and focuses attention on the human dimension of these developments. The important ethical issues have been considered more formally since 1991 by the Nuffield Council on Bioethics, an independent body composed of experts and lay people. Other bodies regulate issues arising from genetics and reproductive biology directly, by ruling on specific proposals, formulating guidelines for policy and responding to public opinion. The Human Fertilisation and Embryology Authority is responsible for all initiatives related to non-conventional methods of conception. The Human Genetics Commission is similarly charged with protecting the public’s interest and will be a crucial forum for debate, with reports on genetic testing and its implications for the insurance industry already in preparation. Independent regulatory bodies and consumer law govern the drug and food industries, and many committees are set up by Parliament and the Royal Society to review and consult on particular issues.

With all these public bodies overseeing the way science contributes to human welfare, should we worry? At a time when our life expectancy is longer than at any point in history, alarming stories forecasting the end of this happy situation are easily found. Indeed, a diverse menagerie of health newshounds, spanning irreproachable watchdogs to the wildest and most irresponsible of jackals, provides extraordinary amounts of information. Whose opinion should we trust? Can we rely on expert opinion given to public bodies? Sober-sounding advocacy groups with strong opinions are particularly difficult to interpret. The British Medical Association advises readers confronted by a scandalous report to ask themselves three questions: Is the content highly emotive? Does it suggest a conspiracy? Does it rely on unpublished sources? They suggest that if the answer to any of these is yes, then the material must be treated with caution, at the very least.
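Treated as a simple decision rule, the BMA's advice amounts to a three-item checklist: a "yes" to any question earns the report caution. The short Python sketch below is purely illustrative – the function and parameter names are mine, not the BMA's.

```python
def needs_caution(highly_emotive: bool,
                  suggests_conspiracy: bool,
                  unpublished_sources: bool) -> bool:
    """Flag a report for caution if any of the BMA's three
    questions is answered 'yes'."""
    return highly_emotive or suggests_conspiracy or unpublished_sources

# Example: a highly emotive report relying on unpublished data
print(needs_caution(True, False, True))   # True -> treat with caution
```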

Scientists of all kinds have a long-established duty in health-related matters to make their investigations objective and reliable, based on honestly and accurately collected data. This information may be used to discriminate between alternative hypotheses and should be published after a peer-review process designed to establish that the work was done to professional standards. The credibility of the findings is then enormously strengthened if they are corroborated by independent investigations. In the aftermath of a crisis, or following potentially revolutionary innovations, the relevant scientific work is likely to interest the public, which needs to understand what is being said and which protagonists it can trust. On serious issues with a high degree of personal relevance, a majority of the public should be able to understand the choices being made. But is this unrealistic?

The BSE (bovine spongiform encephalopathy) epidemic and the emergence of nvCJD – new variant Creutzfeldt-Jakob disease, a fatal neurological disease of humans – critically tested the relationship between scientists, government and the public. BSE, a disease of cattle that appeared for the first time in the 1980s, is caused by a strange protein-only infectious entity, known as a prion, related to a natural cellular protein present in all mammals. The immediate cause of the epidemic was probably a protein-containing food supplement, fed to cattle as a growth stimulant, that contained meat recycled from infected animals. The practice of using meat as a growth stimulant for cattle started in 1926 with no apparent deleterious effects, but in the 1970s the prion probably underwent a critical change that allowed it to infect cattle. This mutation probably occurred in cattle, but it could have occurred originally in a prion derived from sheep meat infected with the scrapie agent; scrapie is a prion disease of sheep known in Britain for centuries. Sadly, farmers continued to use compromised feed even after suspicions were aroused, and the fear that “scare-mongering” might damage the beef export business made the government reluctant to specify the risks, inevitably damaging public confidence greatly.

The crisis deepened because professional expertise was not translated into policy with any urgency. An investigation by the National Audit Office into this and other interactions between the science community and government policy makers frankly exposed shortcomings on both sides. Scientists, the report concluded, were “insufficiently motivated to engage in solving problems”, “did not adequately focus their research on matters relevant to policy” or “communicate their specialist knowledge effectively”. Policy makers, the report alleged, “were insufficiently interested in the relevant research” and found the “technicalities and inherent uncertainties of research difficult to grasp”. The need to make scientific thinking an organic part of government could not be clearer. In parallel, the House of Lords report on Science and Society in 2000 explicitly requested that scientists and science policy should address the public’s anxieties in “direct, open and timely dialogues”.

The problem of conveying risk to the public was apparent in the BSE crisis and many other controversies, and it remains unresolved. Significant increases in the risk of illness, or even of reduced life expectancy, associated with certain activities or behaviours create bewilderingly variable responses. Heavy smokers warned in the most draconian language – a one in ten chance of contracting lung cancer – ignore the news, while tiny risks of contracting a serious disease such as nvCJD can provoke extreme anxiety in others.
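Putting the two risks on a common scale makes the gulf plainer. The figures below are illustrative only; the nvCJD number is an assumption chosen for the sake of the comparison, not a figure from the essay:

$$\text{heavy smoker, lung cancer: } \tfrac{1}{10} = 10{,}000 \text{ cases per } 100{,}000 \text{ people}$$

$$\text{nvCJD (assumed): } \tfrac{1}{1{,}000{,}000} = 0.1 \text{ cases per } 100{,}000 \text{ people}$$

On these numbers the risks differ by roughly five orders of magnitude, yet it is often the smaller one that provokes the stronger reaction.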

What about a lone and critical voice disseminating unwelcome news, such as the report by Dr Andrew Wakefield claiming that vaccination of infants with MMR (the combined measles, mumps and rubella vaccine) was linked with the onset of autism? Extensive analyses of much larger samples gave no support to Wakefield’s hypothesis, but the vaccine is nonetheless still described, incorrectly, as “controversial” by the media. As a consequence, some parents are not permitting their children to be vaccinated and the immunisation campaign is being seriously damaged. Credible lone voices rarely remain lonely for long: fierce competition between different research groups usually confirms or contradicts important health-related claims quickly.

The case of Alice Stewart, the British physician-epidemiologist who died aged 96 in 2002, illustrates how critics of secretive, monolithic commercial organisations can be silenced. In the 1950s, she found that a single diagnostic X-ray of a pregnant woman increased the risk of childhood leukaemia by forty percent. While the medical profession accepted the implication of the published data and took appropriate steps, the American Atomic Energy Commission variously ignored or vilified her conclusion and insisted that a substantially higher maximum level of radiation was safe. More than two decades later, she was vindicated by a Congressional enquiry and the Commission was forced to admit that the permitted exposure had been set too high by a factor of ten.

How can future generations be prepared to respond to scientific issues with a social dimension? The National Curriculum has for some years required British children to study the place of science in society at secondary school. Now the Nuffield Foundation Curriculum Centre has initiated a scheme that focuses specifically on the social impact of science and the choices citizens must make – unlike traditional school science courses, which are designed to prepare students to be scientists. This is an AS-level course called “Science for Public Understanding”, intended to attract a wide range of post-16 students. A lucid and carefully thought-out text, edited by Andrew Hunt and Robin Millar, presents, admirably and objectively, the factual material essential for informed discussion. Let’s hope the scheme will gather momentum and provoke enthusiasm for this approach in a wide range of students.

As we enter an era in which momentous biomedical advances are promised, it is the young who will ultimately benefit from them or regret that they ever happened. Optimists believe novel technology will overcome hitherto intractable problems, including genetic diseases, chronic neurological disorders, cancer and accidental damage (such as the spinal injuries that paralyse their victims). Compelling indications from studies of model animal systems suggest these ambitions are achievable and may become applicable to humans within several decades.

Pessimists fear the profit motive in biomedical innovation for two reasons. First, competition does not seem to be making medicines based on gene technology cheaper, as it would in other economic sectors; costly products may need to be “rationed” by National Health Services, and a class of specially privileged patients may appear. Second, biological innovations of a dubious medical or ethical character could be promoted for mere commercial gain. Developments of this kind, which any rational person would question, include the cloning of human embryos, germ-line gene therapy, genetic enhancement of human embryos, discriminatory use of genetic tests, pre-natal genetic diagnosis for sex selection or other social reasons, and any promotional campaign that encourages genetic neurosis. While the public rejects them now, one fears that the secretive and financially independent commercial world could engineer a change in sentiment. Some prominent American biotechnology companies are already suggesting that the latest frontier of consumerism will be extension of the human life span using medication.

Pessimists and optimists alike must recognise that we are entering uncharted waters, in which all the vigilance we can muster will be required. We can conservatively anticipate a steady improvement in our prospects for health and life expectancy from innovations, but as the age profile of society extends towards greater longevity, important adjustments to our social institutions will be necessary. Above all, we must avoid creating a society of long-lived invalids; biomedicine must remain our servant and not become the master of our fate.
