
6. Medicine in the modern world

  • William Bynum

Abstract

‘Medicine in the modern world’ considers the consequences for modern medicine of the topics covered in the earlier chapters, as well as the impact of cost and the implications of a profit-driven pharmaceutical industry. Themes from each chapter are still relevant today. The Hippocratic legacy is evident in the return to holism; the library has evolved into the internet, delivering information on medical advances instantly to anyone; hospitals continue to develop in response to new innovations and pressures; mass trials and questionnaires represent medicine in the community; and the public looks to laboratory medicine to come up with further life-saving drugs.

What happened next?

The first five chapters have been roughly chronological, from Hippocrates to the outbreak of World War I. This chapter deals with the medicine of the past century. In it, we shall look briefly at the current relevance of each of the five ‘kinds’ of medicine: bedside, library, hospital, community, and laboratory. Each has a place within the budgets of modern healthcare and the lives of patients and doctors.

The driving force behind modern medicine has been cost. The most urgent question of medical care of the last generation or two has too often been: Is it affordable? This question crosses national boundaries, and is applicable to tax-funded schemes such as Britain’s National Health Service (NHS), private insurance and fee-based care in the United States, or basic health functions and medical aid in Africa. Health ‘need’, no matter how it is measured, seems infinitely elastic. The more that is available, the greater is the demand. Spiralling medical costs have shaped modern medicine. At the same time, medical effectiveness has increased in ways that even visionaries of the past could scarcely have anticipated. Thus, concern with efficiency has come to the fore. Medical care has become big business, and has acquired many of the strategies of international corporations. Indeed, many of the suppliers of medical care are international corporations, driven by profit motives. Business leaders point out that a corporation that provides shoddy or over-priced products will lose out to its competitors. Critics of Modern Medicine, Inc., point out that mending bodies and preventing disease should not be like repairing automobiles or selling toys. There is ongoing debate but few points of agreement.

Bedside medicine: the Hippocratic legacy

Hippocrates remains a much-invoked figure today. Healers of all stripes, from mainstream Western doctors to many kinds of alternative healers, claim him as their founding father. Two interconnected aspects of the Hippocratic image continue to attract: the holism of humoralism, and the importance of the patient.

Holism has once again become a mantra in recent times. Most commentators see it as a reaction to the continued reductionism in modern medical science. First bodies, then organs, then tissues, then cells, now molecules. We have institutes of molecular medicine, just as 19th-century German universities created institutes of physiology, bacteriology, or pathology. Looked at dispassionately (people are rarely dispassionate about their health or healthcare), molecular medicine simply represents the culmination of a trend that had motivated doctors since at least the 17th century to push back the level of analysis of disease. It is part and parcel of what can legitimately be described as the progress of medicine and medical science.

This constant aim at ever lower levels of analysis has not met with universal approval, even among medical practitioners. The feeling that ‘we murder to dissect’ has been around longer than the author of the phrase, the Romantic poet William Wordsworth (1770–1850). The Romantics waged war against the inexorable analysis of the parts at the expense of the whole, and following the horrors of World War I, and the rapid growth of specialization within medicine, many doctors felt that a new foundation was needed for medicine. The holism movement that developed adopted Hippocrates as its figure-head, and attempted to conceive disease in general terms such as the patient’s constitution. Doctors encouraged their charges to return to nature, to eat simple foods, wear practical clothes (or none: nudism was also part of the movement), and live lives that were attuned to the dictates of nature. The movement attracted a number of famous doctors, especially those suspicious of experimental science and of medical specialization, and resulted in a number of concrete experiments. In Britain, the most famous was the Health Centre at Peckham, South London, opened in 1928. Its founders argued that medicine had for too long emphasized disease, and that the biology of health ought to be its primary concern. It encouraged family life, and for families to come regularly to the centre, to participate in its physical and social activities, not a million miles away from those on offer at the contemporary fitness club.

The holism movement within medicine was never more than a minority voice, and its influence quickly evaporated after World War II, partly because it had been espoused by a number of leading Nazi doctors, and partly because the new range of biologicals and miracle drugs, above all insulin, penicillin, and cortisone, promised that experimental research might indeed cure all ills. The ‘golden age’ of modern medicine dominated the middle third of the 20th century, and doctors enjoyed an unprecedented era of prestige and trust. Infectious diseases were believed to be more or less conquered, psychiatric disorders were to be controlled by the new Thorazine and the other antipsychotic drugs, and cures for cancers were on the horizon.

It is no coincidence that general practice, or family medicine, was at a low ebb during these decades. In Britain, it was assumed that general practitioners were those not good enough to become consultants in the new NHS, or private consultants in Harley Street. Medical or surgical specialization was the presumed aim of any medical student, for specialists were the elites who ruled the profession.

From the 1960s, things began to change. The Vietnam War sparked a protest generation which was suspicious of all forms of power. At the same time, attacks on the professions as little more than trade unions, concerned with income and with the freedom of their members to do as they pleased, began to gather pace. The Austrian social critic Ivan Illich (1926–2002) launched his attack on educationalists, doctors, and other professionals, with doctors creating as much disease (‘iatrogenesis’) as they purported to cure. Illich urged people (not ‘patients’, or even ‘clients’, as they have recently become) to take control of their bodies and health. Illich was only one of a number of counterculture advocates (in Britain, Mrs Thatcher began her own attack on the professions from a right-wing perspective) who forced doctors and other professionals onto the back foot. Doctor–patient relationships began to change, with power shifting in the direction of patients.

Two developments among many can be mentioned as evidence. First, the nature of general practice began to be reformulated. It had always been more concerned with the ‘whole patient’ than had the specialties, and Michael Balint (1896–1970), among others, highlighted how many psychiatric disorders (such as depression, anxiety, insomnia) were being dealt with by general practitioners. Balint was instrumental in the reformulation of family medicine as a vibrant and important aspect of medical care. It became an academic discipline, and gained prestige within the medical hierarchy. The irony that general practice raised itself up by becoming a ‘general’ specialism, with its own training protocols, examinations, and (in Britain) a Royal College, has not been lost on commentators. The fact remains that it was adapting to the demands of the times.

The second development was the emphasis on primary care in developing countries. International medical aid from the time of the League of Nations, formed after World War I, to the World Health Organization (WHO) and related international agencies created after World War II, had emphasized vertical, technologically driven programmes. Malaria, smallpox, schistosomiasis, hookworm, onchocerciasis (river blindness), and other specific diseases had been singled out for attention. The smallpox campaign succeeded completely, and other programmes had some significant successes, but that for malaria failed, spectacularly.

At an international conference of WHO held at Alma Ata, Kazakhstan, in 1978, the emphasis officially shifted to horizontal programmes, that is, primary care, education, and basic infrastructure, instead of specific vertical programmes aimed at individual diseases. Vertical programmes have not been completely abandoned, but the shift recognized the importance of the general over the specific, in terms of sustainability and efficiency. It prioritized individual health practitioners educating, diagnosing, and treating individual patients and their families.

Hippocrates is a sufficiently secure icon that anyone can identify with him with impunity. Nevertheless, many of the values of bedside medicine in the Hippocratic corpus have re-entered the mainstream.

Library medicine: what price information?

The coming of printed books in the 15th century transformed medical knowledge. Two centuries later, medical and scientific journals changed the timescale. Books might be rushed into print to communicate an exciting new discovery or theory, but they might just as well be the careful product of a lifetime’s reflections. Journals, with their regular production schedule, were designed to be up-to-date. The early journals were mostly the productions of the scientific societies of the 17th century. Doctors and medical topics were well represented, and from the next century specialist medical journals began to appear. By the 1800s, the beginnings of an exponential growth had occurred, although since it was from a low base, it represented fewer new titles each year than we have become accustomed to. Weekly journals, such as those now called The New England Journal of Medicine (1812) and The Lancet (1823), both still influential voices within medicine, allowed even speedier publication and also encouraged leaders, news items, and correspondence, all important in the formation of the modern medical profession.

The deaths of the book and the printed journal have been regularly forecast during the past couple of decades, when the computer, internet, and electronic publishing have transformed the way knowledge is disseminated. Neither has happened, and both books and journals appear at an increasing rate. The economics of publishing mean that ultimate change will undoubtedly be gradual. Nevertheless, ‘library medicine’ now lives like the rest of us in the computer age, and it has had at least two significant impacts on medical care.

First, the relationship between patients and their doctors has been changed by the fact that individuals now have easy access to medical information. Patients curious about the implications of a diagnosis or treatment could always ask their doctors, or take themselves to a library. The internet has made this easier, and has encouraged patients to be more involved in their own medical care. This phenomenon has merely accentuated a welcome process that has been underway for a generation or more. It requires medical personnel to be more communicative, and communication skills are now taught (with varying degrees of success) in medical schools. It also creates problems, since the unregulated nature of the internet means that patients may receive partial, biased, or simply wrong information. Modern concerns with patients’ rights and the ease of access to information have shifted the balance of power between doctors and many of their patients. For the most part, this is a healthy situation, and requires doctors to spend more time with their patients.

Second, patient records have been fundamentally transformed by the new information revolution. There are major issues of access and confidentiality, and any national scheme, such as the one being attempted in the UK, is extremely expensive and so far unsuccessful. The hope that each patient would have his or her own medical record on a chip is good in theory: it would make life for health personnel in accident and emergency departments much easier, and provide doctors with the information they need wherever the patient happens to be. In the short term, at least, the scheme would work mostly for those patients who are sufficiently concerned with their health to cooperate. Access to these data by insurance companies and employers is still an unresolved issue, and the utopian ideal is likely to remain fraught with difficulty.

As librarians become information officers, and doctors stare at their computer screens instead of engaging with their patients, the troubled patient may be forgiven for thinking that the brave new world is not necessarily for the best.

Hospital medicine: what price care?

Hospitals have been central to medicine since the transformation in medical thinking and education that accompanied the French Revolution. They have of course evolved during the past two centuries, in their architectural forms, organization, funding, and medical and surgical functions.

Hospital architecture has become a special subject in its own right, as social, economic, and medical demands have changed. Many hospitals in the early-modern period deliberately reflected their religious origins and aspirations. They were often built, like cathedrals, in a cruciform shape, with altars and, inevitably, a chapel. In many parts of Europe, Roman Catholicism provided both the architectural inspiration and the nursing orders which provided daily care. In Protestant Europe, more secular forms developed, and many purpose-built hospitals in Enlightenment Britain bore more than a passing resemblance to the country house. The smaller specialist hospitals, dealing with such issues as childbirth, venereal disease, smallpox, diseases of children or of the lungs or eyes, were often started in an ordinary house, taken over for the purpose. Successful hospitals would move to larger premises, sometimes simply a larger house, but increasingly into a purpose-built structure. The specific demands were not very different from those of a house: a kitchen, privies or other facilities for waste disposal, rooms for beds, and, generally, quarters for a doctor. Surgery or childbirth generally took place in the patient’s ordinary bed, and sometimes this would be shared with other patients.

During the 19th century, specific medical and surgical requirements began to determine some aspects of hospital design. Pavilion wards, rectangular in shape with tall windows on both sides, had been a feature of military hospitals, and the Nightingale movement within nursing made this style of ward standard for large general hospitals. The pavilion ward had two desirable qualities: the double rows of windows made ventilation easy, in an age when miasmatic theories of disease predominated (Florence Nightingale was an ardent miasmatist and sanitarian); and the shape made nurse surveillance easy. When the Johns Hopkins Hospital was being constructed from the late 1880s, it incorporated the pavilion ward.

By then, however, there were other requirements. German university hospitals had emphasized the need for a small laboratory attached to each ward, where medical staff could perform chemical and microscopical analyses of urine, blood, and other substances. In most hospitals, the acceptance of antiseptic, and then aseptic, surgery led to special operating theatres, with appropriate sterilizing equipment. Germ theory meant that advanced hospitals needed special laboratories for cultivating sputum, blood, urine, and faeces, and cell pathology meant that tissue specimens were examined for cancer and other disorders. Biopsies taken during surgery were often read by the resident pathologist, and the nature of the operation would depend on his reading. From the end of the 19th century, X-ray equipment began to appear in hospitals, requiring space and technicians to take X-ray images and someone to interpret them. Outpatient departments also became important features of hospitals from the 1870s.

Each of these medical and surgical innovations, and many more, required adaptation of existing architectural arrangements or special consideration as new hospitals continued to be built. One should not push the analogy too far, but there are resonances between 19th-century lunatic asylums and prisons, and between 20th-century hospitals and hotels. Both prisons and Victorian asylums were frequently built outside of cities, with surrounding walls and an emphasis on security and isolation. Hotel design and management structures have influenced modern hospitals: both have to provide food and clean linen for residents staying for variable lengths of time, and need laundry facilities as well as wholesale suppliers of food for preparation. Long central corridors with rooms coming off each side were another common feature, to say nothing of getting check-in procedures correct, including, in the United States and private hospitals everywhere, sorting out payment details.

The organizational side of hospital management has increasingly adopted business models. Early in the 20th century, American hospital administrators deliberately looked to modes of industrial production to inspire their drive for greater efficiency. Throughput, cost-cutting, and offering the client decent value for money made sense to administrators concerned with running their institutions at a profit. In Europe, most hospitals were still charitable institutions, but the same values could easily permeate, since budgets were invariably tight, and the main feature of all hospitals during the past century and a half has been spiralling costs. In the clash between medical and economic values, the latter often dominate, no matter what the ultimate source of funding.

Costs are thus a central feature of the modern hospital, and a variety of ways have been adopted to meet them. When hospitals were largely run by religious organizations or private charity (the voluntary hospital was the principal mode of funding hospitals in Britain until they were nationalized in the context of the NHS), budgets were usually the responsibility of those who funded the institutions but rarely used them. Modern surgery, X-rays, and other diagnostic features meant that, from the late 19th century, the rich also had occasion to enter hospital. The British voluntary hospital solution was to build paying wards for the well-to-do, the profits of which subsidized the charitable wards. In the United States, paying wards developed earlier, and private hospitals, such as the Mayo Clinic, developed in Minnesota by the Mayo clan from the 1880s, offered advanced medical and surgical care to those who could pay or who had private insurance. The role of insurance companies in the early 20th century is still insufficiently appreciated in medical history, and although many of the early companies emphasized their philanthropic aims, the profit motive was ever present.

Whatever the system of medical care, in Western societies third-party arrangements are the norm in hospital payments, so large are the bills. The costs of building, heating, lighting, maintaining, equipping, and staffing these complex institutions have been an increasing concern for the past century. The guaranteeing body has been variously the state, the municipality, a religious organization, an insurance company, a charitable group, individual governors, a rich benefactor, or a combination of these. For-profit hospitals, such as those in the United States, attract much criticism for their draconian admission policies, in which the insurance policy is more important than the diagnosis or medical need. But the drive for efficiency, and the adoption of business models, characterizes almost all modern hospitals. In the 19th century, fear of the income loss that chronic illness brought was the primary worry of working people. A debilitating illness requiring lengthy hospitalization and not adequately covered by insurance is now the fear of people who are comfortable as long as they have their health.

New technologies as well as financial constraints have reduced the average length of hospital stays. Getting people out of bed quickly, even after major surgery, is now a surgical goal. There is sound medical evidence that this is a good idea, as it reduces thrombosis, bed sores, and muscle wasting, but the strategy also has an economic rationale, since shorter stays are cheaper. Diagnostic procedures that in an earlier age would have meant a stay in hospital are now conducted in the outpatient department.

Despite the problems, hospitals are here to stay. They have three particular features that make them indispensable: sophisticated diagnosis, acute care, and surgery. Diagnosis was the one thing that hospitals in early 19th-century France were best at, and, for different reasons, going into hospital for a battery of tests is still a common modern experience. Technology and science come together in such procedures as cardiac catheterization, to evaluate heart function; liver or kidney biopsy, to procure a piece of tissue for microscopic examination; the use of ultrasound to monitor foetal development during gestation; or the CAT (computerized axial tomography) scan and MRI (magnetic resonance imaging), two non-invasive means of visualizing structures within the body. The CAT scan and MRI rely on different technological and scientific principles: the former builds up a picture of the interior of the body from serial images that are combined by computer; the latter uses a strong magnetic field that is manipulated by radiofrequency waves.

23. X-rays quickly found their uses in both diagnosis and therapy. In this image of X-ray therapy, from 1902, the apparatus has a shield around it, an unusual precaution at that time. The doctor himself is unprotected, without even a white coat as a badge of office

The two techniques have many similarities. Each innovation has been rewarded with a Nobel Prize for its developers; each produces a three-dimensional image which also shows soft tissues much more distinctly than traditional X-rays; each has dramatically furthered diagnosis and therapy, allowing, for example, needle biopsies that would previously have required invasive surgery; and each machine has been extremely expensive to build, maintain, and use. Since the MRI has fewer patient risks, and produces a clearer image of subtle soft tissue structures, it has largely replaced the CAT scan, but each in turn from the 1980s symbolized the power and costs of modern technology-driven medicine. Along with lasers, fibre-optics, and a host of other modern innovations, they have changed the face of hospital medicine, increasing what doctors can know and do, but also adding substantially to the costs of medical care.

The second feature of hospital medicine that will remain is acute care. Trauma, for instance, is not simply an important branch of military medicine, but also one that must deal with traffic accidents, knife and gun wounds, burns, and the myriad risks that modern society throws up. Terrorism has added to the visibility of the specialty. At the beginning of World War II, European countries made routine preparations for dealing with large numbers of civilian casualties; similar plans are now in place for large-scale disasters, but individual victims of accidents and acute illnesses have always been part of the responsibility of hospitals.

Special places within hospitals were gradually developed to care for those acutely ill or injured. After Listerian antisepsis and asepsis made major surgery feasible, recovery rooms were added to operating theatres, and nurses who specialized in caring for surgical patients were added to hospital personnel. In the 20th century, blood pressure and other vital signs could be monitored, and with the development of intravenous fluids, and during the interwar years blood transfusion, surgical shock and other post-operative complications were dealt with more effectively. In the 1950s, continuous monitoring of the heart-beat was added to the technological equipment present there, and as heart attacks became commonly recognized as a medical emergency, coronary care units evolved to care for the acute stage. Such units are far from peaceful places for patients (or staff), and during the 1970s, it was seriously debated whether heart attack victims were better off at home, simply resting. Better control of irregularities of the heart-beat, a major cause of death in the acute phase of myocardial infarctions, as well as modern resuscitation techniques, has guaranteed the permanence of coronary care units, despite their costs and inhuman environment. Patients who have experienced strokes, diabetic coma, or other debilitating episodes are also treated in such intensive care units.

Modern surgery is also inextricably sited within the hospital. Minimally invasive techniques mean that radiologists, cardiologists, gastroenterologists, and other non-surgical specialists often perform manual procedures, but the surgeon still occupies a privileged place in the modern medical hierarchy. If Nobel Prizes are any measure of medical worth, surgeons have been under-represented, especially in more recent times. Early on, Theodor Kocher (1841–1917) won one for his work on the surgery of the thyroid, and Alexis Carrel (1873–1944), who pioneered vascular suturing, got one, although it was mostly for his research with tissue cultures. Charles Huggins (1901–97), a Canadian-born urologist, shared a Nobel Prize (1966) for showing that tumours of the prostate can be dependent on hormones. His work had been done a quarter of a century previously. The Portuguese neurologist Antonio Egas Moniz (1874–1955) shared the 1949 Prize for his work on pre-frontal lobotomy, now something of an embarrassment. In terms of helping humanity, John Charnley (1911–82), the British orthopaedic surgeon, deserved but did not receive one for his pioneering research on the technology and surgical approaches to hip replacement. Cardiac catheterization also collected one (1956), but none of the recipients was a dedicated career surgeon, reinforcing the point that surgical procedures are now performed by a variety of non-surgical specialists.

The only modern surgical Prize went to three pioneers of transplant surgery, one of the most dramatic aspects of present-day surgery, but one that has involved much basic immunological research, to control the tendency of the body to reject tissues and organs perceived as ‘foreign’. Kidneys, hearts, and livers are now routinely transplanted from donors (generally dead, although a person with two healthy kidneys can spare one). Transplant surgery can accurately be described as a miracle of science and surgery, but it is also iconic for the dilemmas of modern healthcare. Receiving a foreign organ generally puts the recipient in a life-long medical relationship with his or her carers, since powerful immunosuppressant drugs must be taken on a long-term basis and they have unfortunate side effects, including increasing the recipient’s susceptibility to infections. More ominously, the shortage of organs for transplantation has led to an international black market, primarily through desperately poor individuals from developing countries selling their organs for use in the richer countries.

Hospitals save lives. They are also still at the centre of medical education and clinical research, but they suffer from serious structural problems. Funding is almost always an issue, and although they frequently retain the rhetoric of charity and service, they must be run like the complex institutions that they are. Antibiotic resistance among many pathogenic micro-organisms is common today, and the antibiotic-rich environment of hospitals makes them ideal places for this evolutionary phenomenon to occur. Resistance to antibiotics happens when a random genetic change in a micro-organism produces some characteristic that enables it to resist the antibiotic. In ways that Darwin would have understood, the new hereditary characteristic gives the micro-organism an advantage, and it thrives. The staphylococcus, a common bacterium which causes boils but also more serious infections, was initially susceptible to penicillin, the wonder drug of the 1940s. It soon became resistant, and as other antibiotics were developed, it acquired resistance to many of those too. We now know it by its acronym, MRSA (meticillin-resistant Staphylococcus aureus). It is a serious problem in hospitals and, since there is always movement between the hospital and the wider world, in the community as well. The causative agents of malaria and tuberculosis, and HIV itself, have all developed resistance to many of their conventional treatments, complicating these major world diseases.

The hospital has not ‘caused’ this phenomenon; human agency has. But drug-resistant pathogens are now so common that modern hospitals sometimes lose their desired epithet, as ‘houses of healing’, and revert to that old one, ‘gateways to death’.

Medicine in the community: our health in our hands

The 19th-century advocates of public health created an infrastructure throughout the Western world, developed at different speeds and sensitive to differing national ideologies. As we have seen, the movement became more effective after the causation of infectious diseases was better understood, but the infrastructure itself was just as important. The band of individuals (MoHs; water and food analysts; sanitary, factory, and building inspectors; visiting nurses), and the ever-growing set of regulations they were empowered to enforce, were necessary to achieve the reforms that governments increasingly identified as their responsibilities. Public health was supposed to live up to its name, and extend its benefits to all members of society.

On the whole, it did, but vulnerable groups – the poor, children, the aged, and women of child-bearing age – were often targeted and stood to benefit most. While this may put an unnecessarily benevolent gloss on a good deal of late 19th- and early 20th-century public health activity, one historian has argued that war is good for babies and other young children. The war in question was the Boer War, which caused disquiet because so many recruits from the slums of Britain had to be rejected from army service on health grounds; its unsatisfactory outcome led to fears that the British could not sustain their Empire without improving the health and fitness of their people. Similar fears fuelled the public health and pronatalist movements in other Western countries, even if the spectre of racial degeneration (and a perceived birth-rate larger in the proletariat than in the solid middle classes) also stimulated the eugenics movement. Public health had traditionally been environmentalist in its orientation: get rid of dirt, overcrowding, and the slovenly morals that they engendered, and the populace would be healthier. This older mantra was diluted by the emphasis on bad heredity, and the newer scenario that only by stopping undesirables from breeding could Western nations continue their world dominance.

24. X-raying the masses as part of the campaign against tuberculosis was a regular feature of public health initiatives from the 1930s. This Glasgow tram from 1957, invoking a fair-ground ride, tries to make having an X-ray trendy as well as modest (no undressing, but fast and confidential)

25. Contaminated milk was a common source of tuberculosis spread before pasteurization became mandatory. Other potential hazards are noticed here in this 1929 lantern slide, encouraging the public to get involved by reporting to the MoH and complaining to the milkman

As is well known, the eugenics movement reached its apogee in Nazi Germany. Nazi notions of racial destiny, and of the inherent degeneracy of Jews, Gypsies, and other marginal groups, were barbaric in the extreme. The whole Nazi ideology was driven by a ruthless dogmatism, but it ironically included notions of the importance of fresh air and exercise in maintaining health, and a belief that tobacco and alcohol were inimical to it. There are many routes to current ideas of a healthy lifestyle, and not all of them are worth emulating.

The Nazis took ideas of racial hierarchies to the extreme, but racism was widespread in the period. While developed nations can take the surveillance and regulations of public health for granted, or be incensed when they fail, many of the trappings of the older sanitarian movement are still being played out in the developing world. Much has changed, of course, but the problems encountered in poorer parts of the world would not have surprised Edwin Chadwick or other advocates in 19th-century Europe. Issues of child and maternal mortality, epidemic diseases, poverty, and poor sanitation are still with us. While the West combats obesity and sedentary lifestyles, much of the world scrabbles for enough to eat. Old-fashioned public health is still being fought for in many countries. Chadwick thought that clean water and decent arrangements for disposing of human waste would solve most of the problems of filth disease. His medical ideas were naïve, but his admirable aims have yet to be achieved worldwide.

Imperial powers did some work on public health in their possessions overseas. The British in India, for example, took cholera and malaria very seriously indeed. Neither was a uniquely ‘tropical’ disease, since both were known in Europe. But the discovery by Ronald Ross (1857–1932), working in the Indian Medical Service, of the role of the Anopheles mosquito in the transmission of malaria catalysed the development of tropical medicine as a medical specialty. Malaria occurred in temperate climates as well as tropical ones, but in many ways it fitted the model that Ross’s mentor, Patrick Manson (1844–1922), elaborated as the distinct features of the diseases that the specialty had to deal with. It was transmitted by an insect, so had a more complicated life cycle and mode of spread than the bacterial diseases of the Old World. Furthermore, its causative organism was a plasmodium, not a bacterium, fitting Manson’s belief that worms, parasites, and other kinds of organisms were the main enemies in the tropics. Manson used Ross’s work, announced in 1897 and 1898, to convince the British government to found a School of Tropical Medicine in London, in 1898. Another in Liverpool was established a few months earlier, and a spate of institutes and schools of tropical medicine were in existence throughout the world before the outbreak of World War I.

The aim of these schools was to train medical officers to deal with the range of diseases that would confront them in Asia, Africa, and other tropical areas of the world. Tropical medicine was to make these areas safe for Europeans, to carry out their effort to Christianize, civilize, and commercialize the peoples under their dominion. Some historians have dismissed the effort as completely self-serving, carried out by governments and individuals who had no feelings for the ‘natives’, and who in any case merely wanted to create safe enclaves for European soldiers, merchants, planters, and civil servants. If one examines dispassionately the motives and careers of many of the key individuals involved in the effort, a much more subtle picture emerges. At the very least, enlightened self-interest dictated that diseases needed to be controlled among all groups. In Asia, in particular, Europeans often appreciated the richness of the cultures they were controlling and exploiting. In sub-Saharan Africa, a different set of conditions obtained, accentuated by the harshness of the disease profile in Western Africa, in particular, and the absence of a written culture. But it is historically distorting to write off medical and public health efforts in Imperial dominions as simply exploitative.

Most ‘tropical medicine’ before World War I was initiated by colonial powers, to serve their own possessions. The exception was missionary medicine, nurses and doctors who were concerned with spreading the message of Western health values as well as religion. Missionaries were responsible for setting up and manning health centres and hospitals in many parts of the world, and while they tended to follow established Imperial geography, there was some missionary activity outside of home-country spheres of domination. An embryonic international health movement started with the formation of the League of Nations after World War I, although much of its health-related activity was concerned with Eastern Europe and other parts of the war-torn continent. Although the United States government was reluctant to support the League, the Rockefeller Foundation and its international agencies were particularly active during the interwar years. Rockefeller officials were keen to establish Western-style institutions (medical schools, research institutions, and teaching hospitals) in areas where there was the possibility of continued indigenous support and, therefore, continuity. Europe, Mexico, and Latin America were the Foundation’s primary areas of international activity, although its interest in malaria, schistosomiasis, and hookworm took Rockefeller officials to other parts of the world too.

Following the end of World War II, internationalism was finally established through the United Nations and sister organizations, especially WHO. WHO has always had admirable goals, but has struggled with the complexity of the problems it sought to confront. The dominant mode of attacking disease in the interwar years had been vertical: single diseases with specific modes of transmission were singled out as the most efficient way of improving health in poor countries. Smallpox and malaria were the subjects of two major WHO campaigns in the 1950s and beyond. The malaria programme, approved at the 1955 World Health Assembly, was largely inspired by the availability of DDT, the insecticide that was developed during World War II and used with great effectiveness against malaria and typhus (a louse-borne disease) in the war zones.

Ever since Ross and G. B. Grassi (1854–1925) in Italy had discovered the role of the Anopheles mosquito in the transmission of malaria, and elucidated the life cycle of the plasmodium responsible for the disease, its control seemed straightforward. Eliminate the mosquito, by draining and oiling its breeding sites and by employing ‘mosquito brigades’ to patrol the offending places, and the disease ought to disappear. Besides, quinine could cure the disease and had long been shown to protect if taken regularly. Ross spent the last three decades of his life arguing that malaria could be prevented, if sufficient resources were devoted to it. The knowledge was there; only a lack of will (and money) prevented this desirable goal from being achieved.

For Ross, the logic was simple: apply the vertical programme, eradicate or marginalize the disease, and a healthier workforce would achieve the economic development that was impossible as long as the disease raged. For other malariologists, only a horizontal programme would work. The decline of malaria in Europe suggested that if a reasonable standard of living, economic development, and education were in place, malaria would fade out as a consequence. These malariologists argued that in highly malarious areas (much of Africa, for instance), the constant exposure from birth produced a population that was more or less immune. Remove this ‘natural’ exposure, and highly epidemic forms of the disease would thrive.

DDT seemed to consign these arguments to history. It was cheap, had a residual effect after spraying, and promised a technological fix to a complicated and widespread medical problem. The worst-affected parts of Africa were excluded from the mandate, but the plan was that the rest of the world would be malaria-free in a couple of decades. The campaign was approved in a fit of post-war optimism, but it was bedevilled by problems from the start. Spraying equipment would be delivered and there would be no DDT, or vice versa. Training field-workers was slow and laborious. The results in different parts of the world were variable. A growing environmental movement, spearheaded by the publication of Rachel Carson’s Silent Spring (1962), objected to the wider effects that DDT had, and the 1960s protest movement disliked the large-scale organization of the campaign and, especially, the profits that (mostly) American firms were making from it. Finally, DDT-resistant mosquitoes began to emerge.

26. Preventative medicine played an important part in the campaigns of World War II. Here, soldiers are encouraged to take their regular doses of Atebrin, the most commonly used antimalarial drug of the period. Malaria was still an important disease in the Middle East, southern Europe, and the Asian theatres of war

The malaria eradication programme was quietly converted to a focus on control in 1969, with much less fanfare than its launch. Its mistakes have since been easy targets for critical analysis, but it had achieved some successes, for instance in the Mediterranean countries of Europe, where malaria had resurged during the disruptions of World War II. Italy, Spain, Portugal, and, notably, Greece, far less developed economically than the others, were declared malaria-free during the years of the campaign. Sri Lanka came close, and the incidence of the disease in India decreased dramatically.

By contrast, the WHO smallpox eradication initiative is still heralded as a triumph of modern medicine. A triumph it was, since the last naturally occurring case of smallpox was reported in 1977, and the disease was certified as eradicated in human populations in May 1980. It was in the end the product of international cooperation and good will, not of medical science. It relied on the old (folk) discovery of vaccination, and the time-honoured methods of case tracking, isolation, and mass vaccination of populations at risk. There was no treatment save supportive measures. Smallpox could be eradicated because it had no natural animal reservoir, it was passed from person to person, and it could be controlled through isolation and vaccination. It was an administrative campaign, although that in no way diminishes its importance.

Vertical, single-disease campaigns are still attractive, and several have been successful. Polio is almost eradicated, and campaigns against guinea worm and onchocerciasis have also had notable success. Despite the glamour (even if the work may be routine) of single-disease strategies, the importance of primary care has also been recognized. The WHO Alma Ata conference officially mandated horizontal programmes as a necessary goal of international healthcare. In essence, this merely ratified the truism that a medical and social infrastructure is a precondition for sustainable delivery of modern public health and healthcare. Its realization has been slow, as the economic gap between rich and poor has widened in the past few decades, and HIV, drug-resistant malaria and tuberculosis, and wars have intervened. There were some gains, but more setbacks, during the closing decades of the last century, and the outlook is challenging to say the least.

Some of these problems in poorer countries are simply reflections of issues in the West, where alcoholism, drug-use, resistant tuberculosis and HIV, and obesity have become major health matters. One social habit, exported from the West, threatens to be a time bomb in the coming decades: cigarette smoking. The discovery of the direct link between cigarettes and lung cancer is one of the great achievements of modern epidemiological surveillance. Lung cancer was a rare condition in earlier centuries, and its gradual increase during the interwar years was noted by many clinicians and a few statisticians. By the late 1940s, it was recognized as a serious disease of modernity, and the Medical Research Council (MRC) in Britain commissioned two individuals, a mathematically inclined clinician and a statistician, to investigate its spread, and try to determine its cause. The clinician was Richard Doll (1912–2005); the statistician, Austin Bradford Hill (1897–1991). Their own working hunches suggested that lung cancer was probably a disease of modern pollution, car exhaust fumes, or tar from road surfaces.

They began work by devising a questionnaire for patients in London hospitals diagnosed with cancer of the lung, liver, or bowel. The initial striking result was that heavy smoking was common in those with lung cancer, but not in those with the other forms of cancer. At the same time, an American study (1950), based on autopsies of patients dying of lung cancer, also found a high prevalence of smoking in the victims. Based on these suggestive findings, Doll and Hill devised a prospective study, following the health fortunes of more than 34,000 British doctors who agreed to take part in it. Because doctors must give their address changes each year to the Medical Register, an annual list of qualified medical practitioners, Doll and Hill were able to follow their cohort over the years, relating each individual’s chances of acquiring lung cancer to his or her smoking habits. Since many doctors (including Doll himself) gave up the habit once the risks were exposed, the study also offered the opportunity to compute statistically the years gained by giving up the sot-weed. The final part of the study was published in 2004, 50 years later, and was written by Doll himself, with a colleague. It is probably the most remarkable ‘social’ experiment ever devised within medicine. It was simple in design but dogged in execution, and the results unfolded in a series of papers over half a century. By the time the ‘experiment’ ended, much other evidence had been produced on the health consequences of cigarette smoking, but Doll and Hill can be said to have initiated the modern movement of ‘lifestyle medicine’.
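The arithmetic behind such a prospective comparison is simple: count the cases of disease in each exposure group, divide by the amount of follow-up time contributed by that group, and compare the resulting rates. The sketch below is purely illustrative; the counts and person-years are invented for the example and are not Doll and Hill's figures.

```python
# Illustrative sketch of cohort-study arithmetic (relative risk).
# All numbers below are invented for the example, NOT real study data.

def incidence_rate(cases: int, person_years: float) -> float:
    """Cases per 1,000 person-years of follow-up."""
    return 1000 * cases / person_years

# Hypothetical follow-up totals for two exposure groups.
smokers_cases, smokers_person_years = 140, 100_000.0
nonsmokers_cases, nonsmokers_person_years = 10, 80_000.0

rate_smokers = incidence_rate(smokers_cases, smokers_person_years)          # 1.4 per 1,000
rate_nonsmokers = incidence_rate(nonsmokers_cases, nonsmokers_person_years)  # 0.125 per 1,000

relative_risk = rate_smokers / rate_nonsmokers  # about 11 in this invented example

if __name__ == "__main__":
    print(f"Smokers:     {rate_smokers:.3f} cases per 1,000 person-years")
    print(f"Non-smokers: {rate_nonsmokers:.3f} cases per 1,000 person-years")
    print(f"Relative risk: {relative_risk:.1f}")
```

Following a fixed cohort forward in time, as Doll and Hill did, is what makes a rate ratio of this kind interpretable: the exposure (smoking habits) is recorded before the outcome occurs.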

The phrase is barely two decades old, but it seems here to stay. Community medicine involves surveillance, and putting the observations together has produced a picture in which the ordinary individual has a major influence on his or her own health. Our choices influence our well-being. In the golden age of medicine, from the 1940s to the early 1970s, there was every confidence that, whatever we did, doctors could take care of us. Between surgery, antibiotics, tranquillizers, hormones, contraceptives (medicine influencing lifestyle rather than lifestyle medicine), and the range of other drugs and therapies, the promise of an age of health seemed just around the corner. Although medicine is now even more powerful, we are less confident about it. Alcoholism, smoking, drug abuse, venereal disease, obesity, fatty, high-salt takeaways, factory farming, and other dimensions of modern Western living have taken their toll. Many of these indiscretions are old, although some are new. The doctor–patient relationship has changed, and the coming of patient power has brought with it a recognition of patient responsibility.

27. Lifestyle medicine from 1992, in a poster aimed at countering both obesity and the deleterious effects of excessive alcohol consumption

The Hippocratic emphasis on moderation reminds us that doctors have long been moral policemen. What counts as moral, and what immoral, has a tendency to change in different cultural settings. In the early-modern period, a syphilitic lesion could be a kind of badge of honour among some social groups; in the interwar period, good eating meant lots of red meat, cream, and eggs; cigarette smoking was an emblem of female emancipation. Societies change, and so does medical advice. There are good reasons to think that advice now is better than it sometimes was in the past, and even those who distrust doctors and medical science still enjoy the benefits of the surveillance and epidemiological studies that try to tease out the harmful from the beneficial. When in doubt, remember the Hippocratic injunction that health is most likely to be found in the middle way.

Laboratory medicine: still the promise of the new

The modern biomedical laboratory has never been so remote from, and yet so close to, the aware average citizen. Scientists frequently call news conferences when they think they have something important to report; all news agencies carry medical science items on a regular basis. The internet makes sophisticated knowledge available to anyone who wants to take the trouble. Despite our modern information-driven culture, surveys reveal that profound ignorance about health and science is widespread and worrisome. It has probably always been this way, and the physicist and novelist C. P. Snow’s critique of the ‘two cultures’ had resonance before he articulated it in 1959, and still does. Snow argued that most non-scientists are less informed about the main ideas of science than scientists are about those of general culture. Ignorance is everywhere, but ignorance of science and medicine particularly so.

Even if the details elude them, most people know that the medicine practised in the 21st century has been heavily influenced by medical science. Above all, modern drug discoveries, and, more recently, the controversies surrounding the Human Genome Project and stem cell research, have been newsworthy. The latter two are beyond the scope of this historical account, but contemporary medicine has been transformed by the therapeutic power of drugs. Serendipity has played a part in the discovery of a number of them, but the laboratory has been the primary site where their therapeutic potential has first been observed. Claude Bernard’s comment of the 19th century is still true: the laboratory is the sanctuary of experimental medicine.

From the late 19th century, a number of effective pharmaceutical agents with staying power began to filter through. These include aspirin, phenacetin, chloral hydrate, and the barbiturates. They all share the characteristic of being relatively simple chemically, amenable to the analytic methods then available. Aspirin is often mentioned as a drug that would not pass modern safety standards, given that it is a gastric irritant and can be used for suicide. Ironically, in low doses, it has been shown to be effective in preventing blood clotting, and so is used to prevent heart attacks and strokes, uses remote from what it was originally introduced for. The effect is small in the individual but significant in a large population. Its mechanism of action has been worked out only within the last generation, decades after its use as an anti-inflammatory drug and to relieve pain and fever had become routine.

Between this group of drugs and the 1920s came several chemicals and a number of biologicals, especially vaccines and antisera. None of them could compare with insulin, discovered in 1921 by a young physician turned physiologist and a medical student at the University of Toronto. Frederick Banting (1891–1941), the physiologist, obtained the use of the laboratory during the summer, while the professor was away on holiday. Charles Best (1899–1978), the medical student who subsequently became a distinguished physiologist himself, helped in the careful isolation of the active hormone secreted by the pancreas. Amazingly, the substance reduced the blood sugar levels of diabetics, and Banting and the absent professor, J. J. R. Macleod (1876–1935), shared the Nobel Prize almost immediately. Banting and Macleod appropriately shared their Prize money with Best and the chemist, J. B. Collip (1892–1965), who had helped with the purification of the substance. This was a classic one-off experiment, widespread in its therapeutic implications and fully deserving of the Prize that was quickly awarded. Within a year, commercial insulin was available, and for diabetics the new drug could be life-saving. Insulin is paradigmatic of both experimental medicine and modern medical care. Insulin controlled diabetes; it did not ‘cure’ it, and its victims were still left with a permanent affliction that needed daily management. Despite better ways of administering the drug and different preparations, insulin-dependent diabetes is a life-long problem with many complications which also need to be managed as they occur. Time and again, modern hopes of cure have really been the sentence of chronic care, better than the alternative, but less than early expectation. The brutal truth is that the human body is a wonderfully evolved machine, and medicine rarely does as well as nature.

Despite the ongoing issues relating to diabetes control, insulin was a major innovation, and was seen as such by patients. It encouraged the general public to expect more from laboratory investigations, an attitude reinforced by success in treating pernicious anaemia. The results were not so dramatic as those of patients in diabetic coma waking up with the administration of insulin and glucose, but pernicious anaemia, as the name suggests, was a debilitating, distressing, and ultimately fatal affliction. As with insulin, however, the rationale for the therapy was based within the laboratory, in feeding experiments with dogs. The solution, eating large quantities of raw liver, was not exactly what patients might have chosen, but most thought it was better than the consequences of their disease.

These and other laboratory innovations – blood typing making transfusions safe, various vaccines, increased understanding of the nature of viruses – kept scientific medicine in the public eye. The take-off occurred in the years surrounding World War II, ultimately producing the big science that we still have. The sulpha drugs, for instance, were effective against several common bacteria: one consequence was a rapid decline in women’s mortality from puerperal fever (the infection that all too frequently followed childbirth). These drugs were developed just before the war (the Nazis refused to let their discoverer, Gerhard Domagk (1895–1964), go to Stockholm to collect his Nobel Prize), and the war itself put paid to the international patent system, so sulpha drugs could be manufactured outside Germany. During the early years of the war, these drugs were much used; by its end, they had been overtaken by penicillin.

Penicillin is probably the wonder-drug of all time. Its story adds to the appeal: discovered serendipitously in 1928 by Alexander Fleming (1881–1955), through a mould on an uncovered Petri dish, it was more or less neglected for a decade (there were a few isolated attempts to employ it therapeutically). With the outbreak of World War II, the Oxford professor of pathology Howard Florey (1898–1968) and his team were charged with looking for new therapeutic agents against bacterial infections. Penicillin was among the substances they chose, and using makeshift equipment in wartime conditions, they isolated enough of the precious mould to show that it was indeed dramatically effective. Their first patient, an Oxford policeman with a staphylococcus infection following a rose-thorn puncture wound, improved, but there was not enough penicillin to achieve a cure, even though the team recovered the drug from his urine and readministered it. He died.

During the war, Florey and a colleague went to the United States, where pharmaceutical manufacturing was less disrupted. Florey had old-fashioned beliefs about the openness of scientific research, and so failed to pay attention to the patent arrangements. American pharmaceutical manufacturers were much shrewder, and by the last two years of the war were manufacturing large quantities, and making large sums of money. At first reserved essentially for military use (it was effective against many bacterial infections, including syphilis and gonorrhoea, as well as some contaminants of war wounds and bacterial pneumonias), penicillin was in general civilian use shortly after the war ended in 1945.

The penicillin story is a thoroughly modern one. Highly profitable, it needed industrial modes of production and distribution. It was very effective against many common scourges, became cheap, saved many lives, and greatly increased the prestige of the laboratory and of modern medicine more generally. It was a miracle drug, even if miracles don’t last forever. Penicillin was given indiscriminately, in doses that were not correct, for conditions that were not appropriate, and in courses that were not completed. It began to lose its effectiveness as penicillin-resistant bacteria emerged. In the early days, this seemed only a minor problem, since other forms of penicillin were manufactured, and other antibiotics came on the market, including streptomycin, effective against tuberculosis, the age-old chronic bacterial killer. Streptomycin was developed in the United States, and when a small supply reached Britain just after the war, Austin Bradford Hill (soon to turn his attention to lung cancer) turned the limited availability to good effect, designing a proper ‘double-blind’ controlled trial, in which neither the participating doctors nor the patients knew which therapy was being tested. In this way, the bias of expectation could be removed. The results demonstrated the therapeutic effectiveness of streptomycin. Hill’s experimental design has become the gold standard for evaluating new therapies.

Streptomycin, penicillin, and the other antibiotics ushered in a golden age, when new effective drugs and vaccines seemed to be the inevitable result of pharmaceutical and biomedical research. Cortisone appeared in the late 1940s, accompanied by films showing severely crippled victims of rheumatoid arthritis getting out of their beds and walking. New drugs promised to control those cancers that were not within the reach of increasingly sophisticated surgery or radiotherapy. Antipsychotic and antidepressant drugs dramatically reduced the symptoms of schizophrenia, severe depression, and the other afflictions of patients who had spent their lives in psychiatric asylums. Victims of encephalitis lethargica, an epidemic of the 1920s, who had spent decades in a profound, sleep-like stupor, woke up in the late 1960s after being given L-dopa, a drug recently introduced for Parkinson’s disease (the response was dramatic, if short-lived). By the early 1960s, community psychiatry was the buzzword, as psychiatric patients were to be treated as outpatients, in the belief that they would be able to live more-or-less normal lives if they simply took their medicines. For people with mild depression or anxiety, Librium and Valium came on the market. Medicine seemed truly to have, or shortly to have, a pill for every ill.

Before the 1940s, most medical research in the United States was supported by private foundations and charities, of which the cancer, tuberculosis, and polio charities took centre stage. Franklin D. Roosevelt’s own polio kept this disease in the news.

In epidemic form, it became the major crippler of young people, with an average of 40,000 cases per year between 1951 and 1955. As a viral disease, it was not susceptible to antibiotics, and the consequence for those who survived was often life-long disability. Although more prevalent in the United States than in any other country, polio had a worldwide distribution (higher in the West than in poorer countries), and the epidemic in Copenhagen in 1952 was poignant, not only for its severity but for the acts of humanity it inspired. To keep the severely afflicted alive, tracheotomies and intermittent positive-pressure ventilation were used, with some 1,500 volunteers spending 165,000 hours ventilating polio victims by hand. Polio did not conform to the rich/poor divide: it is a disease of decent hygiene. Children in countries without clean water acquire the virus in infancy, when it seldom produces the lasting neuromuscular damage seen when older children and young adults are first exposed.

The viral aetiology of polio, and the fact that people who recovered never got the disease again, made a vaccine the most sensible strategy. The March of Dimes Foundation was wealthy, although its grant applications were evaluated by standards that would be unacceptable today. Several vaccines were prepared in the 1940s, but only with the Salk and Sabin vaccines of the 1950s were large-scale immunization campaigns put into practice. Jonas Salk (1914–95) developed a killed-virus vaccine. Despite some serious glitches, the vaccine was effective, but it was soon superseded by the attenuated live-virus vaccine of Albert Bruce Sabin (1906–93). Sabin’s was administered orally, on a lump of sugar, which made it easy to distribute and popular with children. It had the added advantage that the attenuated virus was excreted in the faeces, providing natural protection by the same (faecal–oral) route through which the disease spreads. Like smallpox, polio is a modern success story, and the disease’s worldwide eradication has almost been achieved. The polio story is full of strong personalities, and no small amount of duplicitous behaviour, but the result was a desirable one.

The success of the polio campaign encouraged more medical research, and the vast industrial-scientific establishment we still have was created. The largest medical research organization in the world, the National Institutes of Health (NIH) in Bethesda, Maryland, was one beneficiary. From the 1950s, the American government began to be a major player in medical research, with ever larger laboratories and multi-authored scientific papers the norm. Whatever parameter one measures, basic medical research has increased dramatically over the past few decades. So have improvements in healthcare, at least in the West. Doctors in the early 21st century can diagnose and manage disease even better than they could in the 1970s. Asthma, cancer, peptic ulcer, cardiovascular disease, and many other conditions are less likely to be sentences of invalidism and death than they were only a generation ago. The changing age profile means that chronic disease is more prominent, and the translation of medical research into clinical practice has meant that many of the gains of modern medicine relate to care, not cure. The promises of health improvements through sequencing the human genome or stem cell research are so far largely unrealized. As scientific capability rises, so do expectations, and many patients, having been promised so much, no longer have patience.

Modern medicine: the reality of the new

It is perception as much as reality that dictates modern attitudes to medicine and what it can, and cannot, do. The thalidomide disaster was a turning point. Thalidomide seemed an excellent drug in the late 1950s, a wonderful preventive of morning sickness in early pregnancy. It was hastily marketed and not adequately tested. A sharp-eyed official in the United States prevented its release there, but thousands of women in more than 40 countries took the drug during pregnancy before the relationship between the drug and birth abnormalities in the limbs of their babies became clear. Although the episode ultimately resulted in a tightening of safety standards for new medicaments, it dented public confidence in the pharmaceutical industry. No subsequent drug has been quite so obviously deleterious, even if several have been hastily withdrawn after side effects emerged. The modern pharmaceutical industry has been of a piece with other multinational corporations. Small firms get swallowed up by larger ones, and contemporary budgets for advertising and sales are larger than those for research and development. Direct advertising of prescription-only drugs in the United States has introduced a new, disturbing element into the industry, and ‘add-on’ medicines, in which small changes are made to an existing drug, occupy too much of the industry’s time. Research tends to follow the common disorders of the West, with their lucrative potential, rather than the major diseases of poorer countries, where there is great need but little chance of vast profits. A long-term chronic disease, in which patients must take their medications for years, or even for the rest of their lives, is the ideal target for a new drug.

28. The studious physician at the bedside: Sir William Osler, one of the most admired physicians of all time, does his stuff, diagnosing and thinking about what he has learned – bedside manner with modern intent

HIV/AIDS provides an object lesson in the status of modern, market-driven healthcare. From its emergence in a particularly virulent form in the 1980s, largely among gay men and injecting drug users in the United States, it has become a symbol of both the power and the problems of contemporary healthcare. Because it first manifested itself in a rich country, biomedical research was marshalled quickly, although some religious leaders insisted that the disease was simply God’s punishment for homosexuality and other forms of sin. President Ronald Reagan took his time uttering the acronym AIDS in public, and the Catholic Church refuses to countenance the use of condoms as a means of preventing the spread of this sexually transmitted disease. AIDS still carries a heavy burden of stigma.

If those at risk thought the official response was muted, this should be compared with traditional Western lethargy about diseases of poor countries that pose no threat to rich ones. A quarter of a century later, the lapse between the earliest cases – Kaposi’s sarcoma, then a rare form of cancer, and compromised immune systems appearing in previously healthy young adults – on the one hand, and the identification of the causative organism in 1984, on the other, seems fairly short. That two groups, one in the United States and one in France, almost simultaneously identified the responsible retrovirus, and each claimed the spoils, is another sign of the times, when the big prizes in science are keenly contested.

HIV was initially known, somewhat condescendingly, as the disease of the three H’s – homosexuals, heroin users, and Haitians. The poor in Haiti were identified as an early vulnerable group, but they were soon joined by the African poor, and it is in Africa and other developing countries that the starkest issues and the most serious social and economic consequences of AIDS are found. In the West, the disease quickly changed from an acute to a chronic one, although one still with a serious mortality rate. Antiviral treatments, available since the 1990s, slow the progress of the disease, but they remain expensive and have side effects. Good nursing care and the timely treatment of infections as they occur are also important in increasing quality of life and decreasing morbidity and mortality. As with so many diseases caused by micro-organisms, however, problems of drug resistance have come to the fore, and the HIV-positive tag remains a grim one.

In some parts of Africa, AIDS is commonly transmitted by heterosexual intercourse, and the prevalence of HIV infection, as well as of the full-blown syndrome, is overwhelming. Treatment is expensive and in any case requires a healthcare infrastructure that is simply missing in most of the continent. Along with malaria and tuberculosis, AIDS has dominated the international health scene for the past couple of decades. All three diseases have strains that resist conventional drug treatment, and their knock-on effects, in terms of morbidity and mortality among young adults, are huge. Disease has further increased the differential between rich and poor and, despite the substantial contributions of the Gates Foundation and other international agencies, promises to continue doing so in the immediate future.

AIDS has been called a social disease whose sufferers looked to medical science for a solution. Science, and the medical practice based on it, are among the most significant achievements of Western culture. We need them, but medical science alone cannot solve the problems of human beings. We no longer live in a world where the idea of inevitable progress carries much conviction.