Clinical Progress

In all countries the inexorable development of clinical medicine to a large extent drives the demands upon health services. The five chapters of From Cradle to Grave (published 1998) deal with clinical developments between 1948 and 1998. This section aims to continue the coverage of new and rapidly developing topics. It is based on material in general medical publications. Specialists are asked to send me additional, preferably concise, material for inclusion.

Paragraphs include

Worldwide there were increasing demands upon doctors and health care systems from the mounting numbers of patients with chronic disease, not only the degenerative ones such as heart disease, diabetes and chest disease, but a wide range of others including chronic uveitis, multiple sclerosis, depression and osteoporosis. Ideal drug treatment for such conditions became ever more complex as clinical trials of new drugs and regimes were published. Even where the evidence was clear, there were wide and unwarranted variations in clinical practice, even in the best academic centres.

National Service Frameworks

For many years government had supported the professions in encouraging good clinical practice; the standard technique was to establish a professional sub-group under the auspices of the Standing Medical Advisory Committee, and then to commend its recommendations. In the seventies both the Conservatives and Labour had issued a range of proposals to improve clinical services, for example Better Services for the Mentally Handicapped (1971). This process was continued with the introduction of national strategies and National Service Frameworks which "set national standards and defined service models for a specific service or care group, put in place programmes to support implementation and established performance measures against which progress within an agreed timescale will be measured".

The first were for mental health (September 1999), coronary heart disease (March 2000) and the health of the elderly. In the seventies there were complaints that the total costs of all the good advice coming from government exceeded the funds available. Now there were similar complaints, not least that the amount of data that GPs and hospitals were expected to collect to demonstrate their performance was exceeding the capacity of the system to deliver.

There had been major improvements in health over the previous 50 years. Life expectancy was longer. Maternal mortality fell from 102 per 100,000 in 1948 to 6 per 100,000 in 1996. Deaths from infectious disease had fallen by almost 90%. The greatest scope for further reduction in fatalities lay in the circulatory diseases, cancer, and injury and poisoning. However some trends were upwards, for example deaths from liver cirrhosis in young people, probably related to alcohol consumption, in both men and women. Smoking rates had fallen over the past twenty years in most groups. The fall was less in the lower social classes than in the higher, and health education had had little impact on working class women. Labour had pledged to end the advertising of tobacco, but this was delayed amid claims that financial interests had influenced the decision (the Ecclestone affair and Formula 1 racing). A White Paper, Smoking Kills, set out a series of modest measures in the hope of re-establishing a downward trend.

On election in 1997, Labour appointed a minister with specific responsibility for public health, initially Tessa Jowell. Labour commissioned an inquiry into inequalities in health, conducted by Sir Donald Acheson, an unusual choice as he had been CMO at the time when The Health of the Nation was prepared under the previous administration. The inquiry team, composed of scientists and including no economist, based its recommendations on published evidence. Because of the clear evidence that the poor generally lived shorter and less healthy lives, the Inquiry's key recommendations involved a wholesale redistribution of wealth. The differences between the mortality rates of social classes IV and I for stroke, heart disease, accidents and suicide were, if anything, widening. Unlike the recommendations of the Black Report (1980), Acheson's 1998 wide ranging recommendations were not costed. Sir Donald wanted to see the package implemented as a whole. 'The inquiry had not looked at cost effectiveness', he said, 'affordability is not a matter for scientists but politicians…' Some recommendations were vague, for example the need to take 'measures to prevent suicide among young people' or 'policies to reduce fear of crime and violence'. Sir Donald had asked an evaluation group to look at the quality of the evidence the inquiry used to reach its conclusions and support its recommendations. For most of these there proved to be no high quality controlled studies showing that the recommendations would improve health - there were few randomised controlled trials available - but hard evidence of effectiveness had seldom underpinned changes in health policy. Indeed the widely held view that the extent of inequality of income in a society correlated with the health of the population was undermined by more recent studies, although at least in the US education was a powerful predictor of mortality, far more than income inequality (BMJ 2002;324:1-2).

The 1992 Health of the Nation initiative in the field of health promotion was reviewed. The strategy, though widely welcomed, had failed to change spending priorities and had made no significant impact on health authorities, trusts or GPs. In 1999 Labour published a revised programme as a Green Paper, Our Healthier Nation, followed by a White Paper, Saving Lives. This reduced the number of health improvement targets to four and reiterated the contributions to health both of social, economic and environmental factors, and of the decisions taken by individuals and their families. The document expressed aspirations, and tended to say what government could do rather than what it would do. The role of Health Visitors would be strengthened, and educational programmes would be introduced, for example sessions at school to help children to avoid accidents. The new policy was not substantially different from the old one, save that the goal was now to improve the health of the worst off in particular. The Chief Medical Officer's report on the State of the Public Health (2001) pointed to the existence of the problem for more than a century, the north/south divide in mortality in the lower social classes, the many programmes that government had set in motion, the even greater number of targets that had been set, and the exhortation to attack inequalities of health.

Public health physicians experienced major problems in their work, not least because continuous alterations in the areas of their organisations made local epidemiology difficult, and the areas seldom coincided with those of local authorities. Until 2001 most worked in Health Authorities - the "purchaser" or "commissioner" side. Some worked as epidemiologists in hospital trusts. Each Authority had a Director of Public Health, sometimes with other roles such as Director of Health Strategy, and usually with support from other consultants and trainees. The functions of public health included:

the evaluation of and reporting on the health of their area's population, assessment of the health services provided, planning of services in relation to need, and provision of community health services including systems of surveillance and the promotion of the health of the local population

The investigation and control of infectious disease and illness due to infective or other toxic agents, and the identification of major health hazards in their area.

As regions had been abolished and regional outposts were integrated into the Department of Health, Regional Directors of Public Health became civil servants with all that that entailed. Observatories were created to report on the problems of the region by the analysis of statistical data. In parallel non-medical staff concerned with public health, e.g. health visitors, health educators and environmental officers, became eligible for membership of the Faculty of Public Health Medicine, and the government decided that the post of Director of Public Health did not require a medical qualification.

The move to Primary Care Trusts, in April 2002, exacerbated the problems of public health as a discipline. Primary Care Trusts were essentially built upon general practitioner registered populations rather than defined geographical areas, and the large number of PCTs, each of comparatively small size, posed problems for public health. Expectations of what public health should be doing also included activities the scientific bases of which were uncertain. While on paper a main priority of government was the reduction of inequalities in health, in practice most interest lay in the clinical services and their organisation, leading to neglect of public health and the long term social and environmental issues which are effective in improving health status.

Immunisation

High levels of immunisation were being maintained. In 2000-01 about 94.5% of children had been immunised against diphtheria, tetanus and polio by their second birthday, and about 94% of 2 year olds had been immunised against Haemophilus influenzae b, 94% against pertussis and 87% against measles, mumps and rubella. Over the previous ten years meningitis as a result of meningococcal infection had steadily increased. More cases were associated with septicaemia, and more children and teenagers were dying. A newly emergent strain, meningococcal group C, was responsible for much of the increase and, in 1998/9, a new vaccine for this strain was introduced into the routine childhood programme and in 2002 for everybody under the age of 25. In 2001 there were 79 confirmed cases of meningitis C and three deaths compared with 551 cases and 47 deaths in 1999, before the vaccine was introduced. Routine infant immunisation against Haemophilus influenzae, introduced in 1992, also continued to prove successful. In 2001 an annual meningitis vaccination campaign was begun to immunise travellers to the annual Hajj Muslim pilgrimage to Mecca, estimated at 50,000; it dramatically cut the number of cases, from 45 in 2000 to just 6, with no deaths, following the campaign in 2002. A specific vaccine incorporating the W135 strain prevalent in Saudi Arabia was used.

Measles, mumps and rubella (MMR) immunisation, however, although introduced some 15 years previously, ran into difficulties. Although the evidence was to the contrary, suggestions that the combined vaccine was linked in some cases to the later development of autism led to longstanding concern about its safety, a fall in uptake and several small outbreaks. Other anxieties were raised about the mercury content of some vaccines. In 2002 a vaccine against chickenpox became available, but where it would fit into the normal programme of vaccination was not decided.

Infectious Disease

Following 9/11, and the use of anthrax through the mail in 2001 in the USA (with many cases of skin infection and several deaths from pulmonary anthrax), the possibility of similar attacks in the UK led the government to issue guidance to GPs and responsible bodies throughout the country. The NHS and the emergency services were primed to plan and organise for bioterrorist emergencies. For anthrax, the brunt of the problem fell on the CDC in Atlanta, Georgia, which was deeply involved in examining the outbreaks and learning more about the epidemiology of a previously rare condition. The US government placed a major contract for ciprofloxacin, an appropriate antibiotic. Concern that smallpox might be used by terrorists led the US and UK governments to order supplies of vaccine and to issue contingency plans for the containment of local outbreaks and for mass vaccination.

The Communicable Disease Surveillance Centre (CDSC) continued to monitor the hazards of infectious disease, and European health ministers planned to strengthen and extend the EU's communicable diseases network so that it could be used in the event of bioterrorist attacks as well as against the threat posed to health by the world-wide mobility of people, food and products.

Citing the global threat of infectious diseases to health, prosperity and national security, the Chief Medical Officer of the Department of Health proposed in January 2002 a strategy he considered would improve the current system of preventing, investigating and controlling infectious disease threats, and health protection more widely - Getting Ahead of the Curve. A new Health Protection Agency (HPA) would combine the existing functions of the Public Health Laboratory Service and three other national bodies (the National Radiological Protection Board, the Centre for Applied Microbiology and Research, and the National Focus for Chemical Incidents) to integrate the approach to protecting the public against infectious diseases as well as chemical and radiological hazards. Local PHLS laboratories would be taken over by the NHS, the agency providing local health protection services, working with the NHS and local authorities, to prevent, investigate and control infectious diseases as well as chemical and radiological hazards. There would also be a national expert panel to assess the threat from new and emerging infectious diseases.

In August 2002 Sir William Stewart, formerly of the Microbiological Research Authority, was chosen to chair the Agency and the PHLS was ordered to transfer its laboratories to NHS trusts by April 2003. The PHLS board believed that shifting the majority of microbiological laboratories to the management of the NHS would break up a managed national system, leading to fragmentation; hospital laboratories were busy enough and might lack the commitment to public health functions such as the tracking of diseases and identification of their causes.

The existing national networks had long linked together a number of surveillance institutes. The London based CDSC, for example, acted as the hub of the operation to track developments linked to legionellosis, salmonellosis and infection with Escherichia coli O157, while the Institut de Veille Sanitaire in Paris performed the same role for tuberculosis and HIV and AIDS.

DNA fingerprinting is a powerful tool for identifying people. Such advances were now applied to fingerprint germs: to show their family relationships; to look at their evolution; to track the spread of germs and identify where they came from; or to prove that a particular strain was the culprit in a particular outbreak. (Other techniques, e.g. phage typing, had been in use since the 1950s.)
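As a rough illustration of the principle, the sketch below compares simplified strain "fingerprints", represented as sets of marker bands, to suggest which reference strain an outbreak isolate most resembles. The marker names and similarity measure are hypothetical, not any laboratory's actual typing scheme.

```python
# Illustrative sketch only: comparing simplified strain "fingerprints"
# (sets of marker bands) to suggest which reference strain an outbreak
# isolate most resembles. Marker names and the similarity measure are
# hypothetical, not a real typing scheme.
def jaccard(a: set, b: set) -> float:
    """Proportion of markers shared between two fingerprints."""
    return len(a & b) / len(a | b)

reference_strains = {
    "strain_A": {"m1", "m2", "m3", "m5"},
    "strain_B": {"m1", "m4", "m6", "m7"},
}
outbreak_isolate = {"m1", "m2", "m3", "m5", "m8"}

# Rank reference strains by similarity to the outbreak isolate.
for name, markers in sorted(reference_strains.items(),
                            key=lambda kv: jaccard(outbreak_isolate, kv[1]),
                            reverse=True):
    print(name, round(jaccard(outbreak_isolate, markers), 2))
```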

Tuberculosis remained a global health problem. The breakdown in health services, the spread of HIV infection, and the emergence of multidrug resistant tuberculosis in many parts of the world contributed to the worsening impact of the disease. Notifications in the UK for 2000 rose 10% on the previous year to 6,797, the highest since 1983. Nearly 60% of new cases of tuberculosis in England and Wales in 1998 occurred in people born in high prevalence parts of the world. In 2001 an outbreak in a Leicester school, though contained, demonstrated the need to maintain an effective tuberculosis service, and to be aware of the global problems presented by the disease.

Food-borne infection continued to cause concern. A survey of 70 general practices produced an estimate of 9 million cases annually, most of which were never seen by a GP. Campylobacter was the most common bacterial isolate, but in the majority of cases none of the main food poisoning organisms was identified. Milk-borne coliform infection in Cumbria received national attention. In 2002 a substantial outbreak of Legionnaire's disease also occurred in Cumbria (Barrow-in-Furness), and Norwalk-like virus gastroenteritis affected a number of large cruise ships.

AIDS and sexually transmitted diseases

Internationally AIDS deaths were set to reach record levels, 21 million worldwide up to 2000 according to the United Nations. In the west, the introduction of highly active antiretroviral drug therapy transformed the prognosis of HIV disease, dramatically reducing progression to AIDS and mortality. However in the USA, while infections contracted homosexually had been stable from 1992 to 1996, the rate of new infections doubled over the next four years. As a result of drug therapy more people were healthy and on the street, and people were less scared of unprotected sex. Complacency in the western world contrasted with the tragedies of sub-Saharan Africa, Asia and, increasingly, China. In Thailand AIDS first affected homosexuals, then in turn drug users, prostitutes, their clients, the wives and girlfriends of the clients, and then the children of those women. In 1999 it became policy in England to offer HIV testing to all pregnant women. It was in the inner cities, with their multi-racial populations, that the incidence of maternal infection was at its highest. In 2000 and 2001 heterosexual sex became, in England, the leading cause of new infections. Most of those diagnosed in the UK who had acquired infection heterosexually were not infected in this country. More than three quarters were recorded as having acquired infection abroad, almost two thirds of the total in Africa. The majority of the African infections were acquired in East Africa, but the impact of the HIV epidemics in southern and western Africa was growing. Sixty percent of new cases were in London, and a fifth of these were in Lambeth, Southwark and Lewisham, more than half of them in ethnic minority populations. The patients might be on student visas, might be seeking asylum, or might be seeking treatment not available in their own countries. CDSC data showed that many cases were in Black-African men and women. Increasingly HIV was drug resistant: 27% of new cases in 2000, compared with 14% in 1994.

In 2002 approximately 41,200 people were living with HIV in the UK, about 31% of whom were undiagnosed. Since the epidemic began in the early 1980s, about 15,000 deaths were known to have occurred in the UK. The number of people living with diagnosed HIV was rising each year, due to increased numbers of new diagnoses and decreasing deaths resulting from antiretroviral therapies.

The consensus view was that in the treatment of established HIV infection initial therapy should include a combination of three potent anti-viral drugs; there might prove to be a case for four. Two new classes of anti-HIV drugs - entry inhibitors and integrase inhibitors - and a second generation of non-nucleoside reverse transcriptase inhibitors (NNRTIs) seemed in vitro and in early clinical trials to be effective against HIV strains that were resistant to currently available drugs. For the third world, new low cost drug regimes, administering zidovudine in pregnancy, offered a chance of reducing transmission of infection from mother to foetus by some 80%; more intensive treatment from 28 weeks of pregnancy was even more successful. But though drug companies might offer special deals, funds were hard to find in the third world and some countries refused to fund the treatment of pregnant women.

Often as a result of casual sex, other sexually transmitted infections were also on the increase. The PHLS figures for 2000 showed that cases of gonorrhoea in England and Wales were at their highest for over a decade; new cases rose from 15,874 in 1999 to 20,190 in 2000, an increase of 27%. Until 1998 the number of cases of infectious syphilis had remained stable among both sexes in England, but then more than doubled between 1998 and 2000 (from 172 to 372) in men and rose by 53% (102 to 156) in women. In 2000, 48% of syphilis infections in men were homosexually acquired.

Cases of chlamydia had been rising since 1993, partly as a result of improving awareness and diagnosis. Cases rose from 53,221 in 1999 to 62,565 in 2000, an increase of 18%; in males the increase was from 22,596 to 26,877 (19%), and in females from 30,625 to 35,688 (17%). In 2001 chlamydia became the most common sexually transmitted infection seen in clinics, with a total of 71,055 diagnoses. Although chlamydial infection could later lead to pelvic inflammatory disease, it frequently had no symptoms and passed undiagnosed, and it was likely that these figures still represented as little as a tenth of the real number of infections. Following pilot trials in which approximately 1 in 10 of those tested were found to be infected, a national chlamydia screening programme for women aged 16-24 using sexual health services was proposed, and ten centres were established for the next phase.

BSE

The Labour government established a judicial inquiry into the handling of BSE, chaired by Lord Phillips, which reported in October 2000. Twenty-eight ministers, civil servants and scientists were criticised in the Report. A culture of inter-departmental dispute between the Ministry of Agriculture and the Department of Health, and unnecessary secrecy, was exposed. The public assurances of safety, given at the most senior level, had been revealed as flawed at the time they were given. The then Chief Veterinary Officer, Keith Meldrum, and the then Chief Medical Officer, Sir Donald Acheson, were accused in the Report of glossing over potential health risks. 'Safe' seemed to mean different things to the public and to government officials. Senior Conservatives immediately accepted responsibility for the crisis and a government compensation scheme was introduced.

There were continuing cases of new variant Creutzfeldt-Jakob disease (vCJD): 3 in 1995, 10 in 1996, 10 in 1997, 18 in 1998 and 15 in 1999; by February 2003 a total of 130 confirmed and probable cases, with 122 deaths, had been reported throughout the UK. BSE was reported in cattle in France and Germany, and vCJD in France. A Europe wide policy was now necessary.

Molecular strain typing and transmission studies confirmed that vCJD and cattle BSE were caused by the same prion strain. A cluster of five cases in a single small village, Queniborough, provided useful information. All patients had lived in the village between 1980 and 1991, and died between 1998 and 2000, providing some indication of the incubation period in their cases. Traditional butchery practices in small abattoirs at that time seemed the probable cause.

The possibility of a substantial epidemic in coming years remained, in the absence of firm evidence of when the infective agent entered the food chain. Of those known to have been infected at least two had been blood donors, and there was a small risk of infection from transfused blood. In 1998 this was reduced by the purchase of some plasma products from the USA, and in 2002 Life Resources Inc., a US firm, was purchased by the Department of Health to secure the long term supplies needed by the NHS. In 1998 the blood transfusion service also started depleting blood to be transfused of white blood cells. The possible transmission of vCJD from mother to child created yet another problem. The development of a better method of identification of infection in its earlier stages, by tonsil biopsy, opened the possibility of population screening to determine the incidence of infection. New guidelines in January 2001 suggested that disposable surgical instruments should be used in operations such as appendicectomy and tonsillectomy in spite of the cost. Shortly afterwards operative difficulties, and possibly deaths, followed their use and they were withdrawn. The first possibility of a breakthrough in vCJD came from the USA in August 2001, when it was reported that chlorpromazine and quinacrine (an anti-malarial) had an effect on prion proteins, preventing them from converting healthy prions into disease-causing forms. A UK trial of the new therapy was organised.

Compensation became available to people who had been infected with hepatitis C from blood transfusions given after the hazard was recognised but before the introduction of routine screening in April 1991, and who had developed liver cirrhosis. A possible bill for the NHS of £10 million was in prospect. In October 2001 compensation was also offered to victims of vCJD and their families.

Screening

The National Screening Committee, established in 1996, was one of the bodies attempting to introduce evidence into medical protocols. In its first report in 1998 it identified almost 300 screening programmes, many at a research stage and nearly 100 in practice. Only four met stringent criteria for both quality and evidence of effectiveness: breast and cervical screening, and neonatal blood spot screening for phenylketonuria and hypothyroidism. Though screening had earlier been considered harmless, there was a growing body of evidence that it could harm people, particularly because of false-positive and false-negative results. To Wilson's earlier criteria (see From Cradle to Grave) was added a new one, that there should be evidence from high quality randomised controlled trials that programmes were effective in reducing mortality or morbidity. In 2003 it was agreed to work towards a national screening programme for bowel cancer, the second most common cancer in men and women.

"Multi-phasic screening" as a form of health check, had been popularized in the USA in the sixties. Now a new form of "screening" emerged there, with the introduction of whole body spiral CAT and MRI scanning. Mobile units might offer cardiac, thoracic or abdominal scans. Other organisations provided Doppler ultrasound investigations to the worried well. There were, of course, some positive findings - for example young men with an operable but clinically silent cancer. Such procedures might take only a few minutes and cost $200 each, not an impracticable sum.

Alternative medicine

Complementary, or alternative, medicine remained in public demand. The main common factor seemed to be the time and patience of practitioners of alternative medicine, commodities in short supply in the NHS.

The Prince of Wales, like many of the Royal family, was a longstanding advocate for these therapies, or at least for research into their effectiveness. In November 2000 a sub-committee of the House of Lords Select Committee on Science and Technology, chaired by Lord Walton, reported that there was scant evidence that alternative remedies worked. Yet the public spent £1.6 billion annually and 50,000 practitioners were treating some 5 million patients. Only osteopathy, chiropractic and acupuncture were backed by scientific evidence. The evidence for herbal medicine was mixed, and that for homeopathy was anecdotal. The subcommittee was particularly concerned about dangerous and inaccurate information that appeared in some media articles, and on the internet. Better regulation of practitioners, and control of misleading product labelling, were needed. In December 2001 the Department of Health expressed a willingness to consider the provision of some forms of complementary medicine within the NHS, subject to evidence of its effectiveness. This was generally lacking; indeed some complementary practitioners rejected the whole idea of assessment and trials. Regulating a range of groups, some of whose philosophy inclined to the orthodox with a scientific base, others of whom claimed the healing traditions of the wise woman and earth-mother, or wished to harness psychic energy, pyramids and force fields, would clearly be difficult.

Clinical specialties

Medical genetics

Genetic medicine was developing within the NHS. Academic centres had evolved into regional centres serving populations of 2-6 million. Clinical and laboratory services worked closely together, and developed "hub and spoke" systems with clinics in district hospitals. They provided access to the latest developments, clinical diagnosis, laboratory (DNA and chromosomal) diagnosis, genetic counselling and the long term care of extended families. In April 2001, in an attempt to create a national approach to clinical genetics, the government announced the forthcoming publication of a green paper on genetics. Awareness that demand for services might exceed resources also led to the establishment by the Department of Health of the Genetics Commissioning Advisory Group, to develop ways to evaluate and set priorities for genetic technologies within the NHS.

The Human Genetics Commission (HGC) was created in 1999 to provide the Government with strategic advice on human genetics. Ministers asked the Commission to look at the wider social and ethical issues involved in the use of genetic data in insurance. The Commission was concerned that the results of genetic tests might be used by insurance companies to the detriment of the population; proposals to use such results were examined by the Department of Health's Genetics and Insurance Committee (GAIC). In October 2000 the committee agreed to allow insurers in the UK to use genetic test results for assessing the risk of Huntington's disease. The Gene Therapy Advisory Committee (GTAC) was also established to advise on the ethical acceptability of proposals for gene therapy research on humans, taking account of the scientific merits and the potential benefits and risks, and to advise on developments in gene therapy research.

The lengthy hunt for the structure of the human genome accelerated as commercial interests united with the international programme led by the American National Institutes of Health. A draft structure for the entire genome was announced in June 2000. Breakthroughs in basic science sometimes feed through to major advances in clinical medicine, as microbiology did at the end of the 19th century, and imaging and immunology did later. Genetics is fundamentally concerned with the cause of disease. Some diseases are wholly genetic in origin, for example cystic fibrosis. Others are largely environmental, for example asbestosis. But many common conditions are a combination, for example coronary heart disease and asthma. Knowledge of the human genome seemed likely to underpin a further advance in clinical care.

The development of pre-implantation genetic diagnosis (PGD) offered the possibility of ensuring that a newborn child was not a carrier of some genetically determined diseases. Genetics also offered the possibility of early identification of those people likely to become ill, because some disorders of adult life, for example vascular disease and diabetes, are preceded by a prolonged presymptomatic period, even though environmental and dietary factors may play a larger role. This opens the possibility of predicting and preventing disease, instead of diagnosing and treating it at a later stage. Genetics also provided an insight into the cause of many diseases at a molecular level, making the transition from merely describing a disease to understanding its mechanism. Diseases previously thought of as one could be separated into categories with a different origin - and therefore treatment. For example a computer algorithm designed to seek out those breast tumours that had the most similar genetic profiles, and cluster them together, revealed that 98 cancers fell into two main groups that could be recognised on the basis of the activity of 70 genes. A woman who possessed a "poor" 70-gene signature would be 15 times more likely to suffer a recurrence within five years than a woman who had a "good" genetic signature, and the latter group might possibly be spared aggressive chemotherapy with all its side effects. The possibility also existed of developing drugs with an appropriate therapeutic action. Genetics provided the pharmaceutical industry with a wealth of new targets against which to design drugs; 'suddenly the industry went from famine to feast'.
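The clustering approach described above can be illustrated with a minimal sketch: synthetic expression values for 98 tumours and 70 genes are grouped into two clusters by hierarchical clustering. The data, distance measure and parameters below are invented for illustration and are not the published 70-gene method.

```python
# Illustrative sketch only: hierarchical clustering of tumour gene-expression
# profiles into two groups, in the spirit of the 70-gene study described above.
# The data are synthetic and the parameters are assumptions, not the published method.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_tumours, n_genes = 98, 70

# Synthetic expression matrix: half the tumours carry a gene-specific "signature"
# added on top of random noise, the other half do not.
hidden_group = rng.integers(0, 2, size=n_tumours)        # the "true" grouping
signature = rng.normal(scale=1.5, size=n_genes)          # per-gene signature effect
expression = rng.normal(size=(n_tumours, n_genes)) + hidden_group[:, None] * signature

# Cluster tumours by correlation distance between their 70-gene profiles,
# then cut the tree into two clusters (e.g. "good" vs "poor" signature groups).
tree = linkage(expression, method="average", metric="correlation")
labels = fcluster(tree, t=2, criterion="maxclust")

for cluster_id in (1, 2):
    print(f"cluster {cluster_id}: {np.sum(labels == cluster_id)} tumours")
```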

In primary care as much as in the hospital service, an increasing number of conditions could now be seen in a genetic context.

Categories of genetic medicine relevant in primary care

Reproductive risk, for example haemoglobinopathies, cystic fibrosis, muscular dystrophies, and many rarer autosomal recessive conditions; chromosomal disorders (such as Down's syndrome and Edwards's syndrome)

Adult onset genetic disorders with a mendelian inheritance pattern, for example Huntington's disease, subsets of common diseases such as familial cancers (BRCA1, BRCA2), and maturity onset diabetes of the young

Common diseases with a multifactorial aetiology, for example ischaemic heart disease, asthma and diabetes

Normal genetic variations in drug metabolism and immune response

From Emery J, Hayflick S. The challenge of integrating genetic medicine into primary care. BMJ 2001;322:1027-1030.

The drug treatment of disease

The pharmaceutical industry was having an ever-increasing impact on health services. Increasingly health services were concerned with chronic diseases rather than with the acute illnesses that had been the accent of previous decades. Particularly in the USA, 'disease management programmes', in which health services sometimes contracted the management of some chronic diseases to pharmaceutical companies, were adopted. In theory systematic, integrated, evidence-based and long-term care of chronic high cost diseases such as asthma, rheumatoid arthritis and diabetes might be more effective. However, the management of specific diseases by a separate organisation risked the fragmentation of care, as patients with multiple unrelated pathology might be directed to specialised units.

Ask your doctor about...

A substantial number of patients, however, consulted their doctors as a result of advertisements that they had seen on TV or in the papers. In 1997 the US Food and Drug Administration (FDA) further relaxed the controls on direct-to-consumer advertising (DTCA) of prescription drugs. DTCA was a powerful tool, designed to maximise profits by encouraging patient demand. Bob Dole, the former US senator and presidential candidate, appeared in the US in a TV commercial about erectile dysfunction, paid for by Pfizer. Some drugs now became household names: Viagra, Prozac for depression, Claritin for allergies, Rogaine for baldness and Imitrex for migraine. DTCA might be inaccurate; from 1997 to 2001 the FDA issued 94 notices of violations, mostly because the benefits of the drug were hyped and the risks minimised. In 1999 drug companies began 'public awareness campaigns' in England. These had little to do with health education, for the material and the advertisements were not about inexpensive diuretics, immunisation or cervical smears, but about unsightly rashes and cures for baldness. In 2001 the European Commission proposed changes in EU law to allow DTCA in three disease areas (AIDS/HIV, diabetes and asthma) for a 5-year period followed by a review; the proposal was categorically rejected by MEPs the following year.

The internet also increased public knowledge of the pharmaceutical products available and, inevitably, increased the pressure on doctors to prescribe the drugs publicised. The drug industry regarded the internet as an essential part of its DTC campaigns, one aimed at people actively searching for information (e.g. www.emc.vhn.net). Such developments were slower in Europe, where governments inevitably wished to control drugs budgets.

The health service was being overtaken by an increasing number of expensive but clinically effective drugs, sometimes "life-style" in nature. Prozac, HRT and Viagra were examples of pharmaceutical advances that might improve aspects of quality of life to which medicine had previously paid less attention. In January 2001 NICE approved three new drugs for the treatment of mild or moderate Alzheimer's disease. A new anti-obesity drug, Xenical, offered an alternative approach to a common problem by reducing fat absorption; NICE agreed in March 2001 that it could be prescribed under the NHS when patients were motivated to lose weight and obesity was significant and posing a threat to health. In April 2002 NICE recommended the use of bupropion (Zyban) and nicotine replacement therapy (NRT) for smokers who wished to quit.

The decade also promised a new round of expensive yet effective drugs, for example statins for coronary artery disease and stroke. Adults with diabetes of the insulin resistant type stood to benefit from a new class of drugs, the thiazolidinediones. Organ transplantation became more reliable with the development of new immunosuppressants. New drugs for schizophrenia, which had been adopted rapidly in North America and Scandinavia, were also available.

Drug resistance

Bacteria are adept at developing drug resistance, and this ever-increasing problem led to increased alarm. The Standing Medical Advisory Committee (SMAC) reported upon it; up to 75% of antibiotic use seemed of questionable value, yet there seemed to be an inevitability to the problem. Society demanded easy answers and there was increasing use of broad-spectrum antibiotics. People might be treated inappropriately or for too long; antibiotics were used on fruit trees and salmon farms. The most vulnerable members of society were often crowded together, and while there were worldwide pressures for greater efficiency in health systems, with higher bed occupancies and stretched nursing and medical care, hospital bed occupancy was higher in the UK than in most developed countries. With patients waiting in A & E, some with infected wounds might be admitted to wards, such as cold orthopaedic wards, which should certainly have been kept bacteriologically clean. In the 19th century hospital infection was worst in the busiest hospitals, and history was repeating itself. Yet the essentials of control were well known: reducing antibiotic use, and improving hygiene and hospital cleanliness. The National Audit Office reported that dirty hands and unsanitary conditions in hospitals caused 5,000 deaths a year and over 100,000 inpatients became seriously ill with infections, costing the NHS £1 billion a year.

Staphylococcus resistance had been a problem in the 1950s and 1960s, then became far less of one. In the eighties infection by methicillin-resistant Staphylococcus aureus (MRSA) began to increase once more, and by 2000 roughly a third of strains isolated in hospitals were resistant. Some now showed resistance to the only remaining antibiotic, vancomycin. The number of patients from whom MRSA had been isolated had multiplied more than ten fold over ten years, and 2000 marked yet another annual increase in the proportion of Staphylococcus aureus bacteraemias caused by MRSA. Voluntary surveillance of hospital-acquired infection had long been undertaken; government made it mandatory in April 2001. The UK was one of the European countries with the highest rates of MRSA.

Until the 1980s a steady stream of new antibacterials had become available. The 1980s saw little new investment in them, but that began to change. The continuing emergence of resistant strains established the need, and three approaches were followed: the modification of existing agents, genomic approaches and vaccine development. Agents such as the newer fluoroquinolones, active against anaerobes and streptococci, began to appear. In January 2001 Zyvox, a new antibiotic for the treatment of resistant organisms including MRSA, was approved for hospital use. With a multiplicity of antibiotics, and ever increasing microbial resistance, accurate information on antibiotics and their proper use became essential. A free peer-reviewed database was provided on the internet by Johns Hopkins, Baltimore.

Radiology and diagnostic imaging

Computed tomography was developing rapidly. In the late eighties the rotation of the x ray source was combined with continuous movement of the table on which the patient was placed. Because the tube was rotating while the subject moved smoothly through the scanner, the x ray beam described a spiral pathway. This meant more rapid scans, more closely spaced scans, and a scan within a single breath-hold, giving three-dimensional images. Images were better, providing new applications for the imaging technique. For example, the colon could be viewed in exquisite detail. Next, multi-slice scanners were introduced, with not one row of detectors but up to eight. Image acquisition was even faster and a larger area could be covered. The newer scanners were inevitably more expensive, and there was a danger of rising radiation doses.
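A minimal sketch of the helical geometry described above, assuming the standard definition of pitch (table travel per 360-degree rotation divided by the collimated beam width); the scanner figures in the example are hypothetical, not taken from any particular machine mentioned in the text.

```python
# Illustrative sketch of helical (spiral) CT geometry. The definitions are the
# standard textbook ones; the numbers in the example are hypothetical.

def helical_pitch(table_feed_mm_per_rotation: float, total_collimation_mm: float) -> float:
    """Pitch = table travel per 360-degree tube rotation / collimated beam width."""
    return table_feed_mm_per_rotation / total_collimation_mm

def scan_time_s(scan_length_mm: float, table_feed_mm_per_rotation: float,
                rotation_time_s: float) -> float:
    """Time to cover a given scan length at a constant table feed."""
    rotations = scan_length_mm / table_feed_mm_per_rotation
    return rotations * rotation_time_s

# Example: a 4 x 2.5 mm multi-slice acquisition (10 mm total collimation),
# 15 mm table feed per rotation and a 0.5 s rotation, covering 300 mm of chest.
print(helical_pitch(15.0, 10.0))      # pitch of 1.5
print(scan_time_s(300.0, 15.0, 0.5))  # 10.0 seconds, within a single breath-hold
```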

In 2000 Nutt and Townsend, working first in Geneva and later in Pittsburgh, combined computerised tomography (CAT) scanners with positron emission tomography (PET). The result was a single machine that could simultaneously image anatomical structure, for example cancerous tumours, and metabolic processes, reducing the time to a diagnosis and reducing patient discomfort. Conformal radiotherapy, which enabled an accurate dose to be given to precisely the required field even if it was irregular in shape, enabled higher doses to be given with less tissue damage and, hopefully, better outcomes. "Intensity modulated radiotherapy" enabled different doses to be administered to different parts of a tumour. Electron beam imaging was also under development.

Progressive development of imaging systems also aided the development of minimal access surgery. Image-guided surgery used imaging systems during a surgical procedure to assist its performance. Magnetic resonance imaging was the best technique, but it had not been practical because the machines had been fully enclosed. The first truly open scanner was installed in Boston in 1994, and in 1999 one was installed at St Mary's, Paddington. Surgeons had full access to any part of the patient's body while that part was simultaneously scanned. Endoscopic views could be combined with MRI images, and important structures could be identified and safeguarded during surgery.

Surgery

Surgeons and surgery figured frequently in the medical issues covered by the media. Sir Barry Jackson, when President of the Royal College of Surgeons, was frequently to be seen on TV, providing quiet and thoughtful comment on cases of clinical misadventure, for example Bristol.

Cosmetic surgery was becoming a growth industry, and in some circles an obsession. Both in the UK and the US the demand was growing rapidly. While the NHS was providing, if anything, less of a service and therefore less training for young surgeons, the private sector mushroomed. Professionally, the main controversy was who should be regarded as qualified and competent to carry out plastic surgery, for some of those operating were clearly lacking appropriate skills. The use of botulinum toxin (Botox) to reduce wrinkles was popular; in the US it was approved by the Food and Drug Administration for the removal of wrinkles and Botox parties were commonplace. In the UK Boots planned to provide facilities in some of its larger stores. Collagen fillers were widely used and sugar molecule injections looked set to become the next hot cosmetic item.

Minimal access surgery

Minimally invasive surgery continued to progress, facilitated by improvements in miniature video cameras that produced images good enough for the operator and the assistant to work together. Virtual reality simulators became available for training. Laparoscopic cholecystectomy was becoming the technique of choice.

New surgical procedures, unlike new drugs that were subjected to clinical trial before licensing, might be developed anywhere in the world and introduced by a surgeon locally without much in the way of formality. Indeed, many operations, for example those on the heart, would never have been developed if the high mortality among the earlier cases had been judged as it would have been for a drug. Operative procedures were now considered by the National Institute for Clinical Excellence (NICE). Twenty years previously minimal access surgery had been hailed as a major advance, saving the patient pain and reducing the length of admission. Subsequent reassessment revealed the complication rate of such operations, and the techniques of open operations were improving in any case, as for inguinal hernia. NICE reviewed minimal access surgery for this condition, of which there were 100,000 new cases each year. NICE recommended in December 2000 that laparoscopic surgery should not be used for colo-rectal cancer, and in January 2001 that people with first time hernias of the groin should have ordinary (open) surgery rather than a minimal access procedure. For the repair of hernias that recurred or were on both sides, minimal access surgery should be considered; a totally extraperitoneal procedure, which did not involve opening the peritoneum, was preferable. It was recommended that such surgery should only be undertaken in surgical units with appropriately trained operating teams that regularly performed these procedures. The open procedure was, in general, cheaper. Centrally formulated guidelines were increasingly affecting the decisions of surgeons.

Virtual reality in surgery

New technologies developed in the eighties and nineties, in particular virtual reality and robotics, were coming to practical fruition. In general surgery, simulators were introduced, for example to train and assess new professionals. Neurosurgeons were using image guided surgery and augmented reality. "Master-slave" robotic procedures were used for minimally invasive coronary artery bypass grafting and laparoscopic surgery.

Fast track surgery

Newer approaches to pain relief, better (sometimes regional) anaesthesia, minimally invasive techniques, optimal pain control and aggressive postoperative rehabilitation were reducing patients' responses to stress and shortening recovery times. As a result earlier discharge was possible, and fast track surgical units extended the longstanding achievements of day surgery. Operations such as splenectomy, vaginal hysterectomy and mastectomy were becoming possible on a day or 24 hour stay basis. Purpose-designed fast-track surgical units were under development, and these were seen as one way to reduce lengthy waiting lists for treatment.

Orthopaedics and trauma

Major accident units had, sadly, a number of occasions on which their skills were required, including railway disasters. Increasing numbers of gunshot injuries in London, and some other big cities, related to illegal drugs, sent trauma surgeons in search of the training available in countries with longer experience in this field. In the wake of the terrorist attacks on the World Trade Center hospitals reviewed their emergency planning.

The improvement of imaging, for example MRI, had a substantial impact on orthopaedic practice. Musculo-skeletal imaging, for example of the knee joint, made greater accuracy possible in the assessment of suspected cartilage and ligament injuries, often substantially altering the optimal treatment.

Trauma and orthopaedics, which had seen many changes over the previous 20 years, looked set for greater advances as a result of the development of new materials, computer aided manufacturing technology, and molecular biology. From the physiological standpoint, improved understanding of the way in which the body responded to major trauma and severe multiple injuries led to the introduction of new methods of managing them. Sometimes it appeared best to intervene less in the immediate phase after injury. Rapid restoration of fluid volume and blood pressure might lead to further catastrophic bleeding; hypothermia could sometimes protect from brain and tissue damage. New drugs were introduced to reduce the likelihood of multiple organ failure in the weeks after injury.

Roughly 50,000 hip arthroplasties were performed annually, mainly for osteoarthritis. Younger patients, leading an active life, were likely to wear out the replacement hip, and NICE suggested that they should be considered for a metal-on-metal resurfacing arthroplasty, in which the femoral head and the joint were both fitted with new metal wearing surfaces.

Femoral fractures in children had traditionally been treated with traction and hospital stays of 4-12 weeks. A new technique, flexible nailing of the femur, developed in Switzerland, allowed early mobilisation. Flexible nails produced from new metal alloys were small enough to fit the intramedullary canal in children but were also able to maintain their shape after contouring and were strong enough to provide stable fixation. They were inserted through the skin through a 5 mm incision. The length of time spent in hospital was reduced, allowing earlier weight bearing, movement and return to school, with fewer complications. The nails were removed after the fracture had healed.

Sports injuries were an ever-growing part of the work of accident and orthopaedic departments. While in the seventies and early eighties knee replacement was widely considered a poor operation, by the nineties the basic principles of successful surgery had evolved. The joint would be re-surfaced, reproducing the normal anatomy with a low friction joint, the remaining ligaments providing stability. Some 35,000 operations were now performed per year, with about a 90% success rate at ten years.

Many injuries, including sporting injuries, damaged articular cartilage, which had poor potential for repair. Damage might lead to arthritis many years after injury. Transplantation of hyaline cartilage had been used for a number of years, but there were few sites where donor articular cartilage could be harvested without damaging the joint, so only small defects could be treated with this method. However, a new patented technique, autologous chondrocyte transplantation, allowed small amounts of hyaline cartilage to be harvested, the chondrocytes extracted, and the cell population increased in tissue culture. The number of cells increased by about 15 times over four weeks. These cells could then be reimplanted beneath a periosteal patch sutured over the articular defect.

Cardiology and Cardiac Surgery

The mortality and morbidity of heart disease ensured it a significant place in plans for improving the health of the population. Many of the common problems in clinical practice relate to thrombosis. The underlying pathophysiological process in myocardial infarction and stroke is thrombus formation. Common cardiovascular disorders such as atrial fibrillation and heart failure are also associated with thrombogenesis. Thrombosis is also a clinical problem in various cancers and after surgery, especially orthopaedic surgery. Anti-platelet therapy, for example aspirin and a wide range of other more recently developed drugs, was shown to protect against such problems in patients.

The evidence of benefit to patients with high blood cholesterol levels and known atherosclerotic disease from the use of cholesterol lowering drugs, for example the statins, had been clear since the mid 1990s. Yet although they were easy to take, effective and comparatively free from side effects, they remained underused. Fewer than a third of patients who had a history of coronary artery disease or stroke received lipid lowering treatment. Deep vein thrombosis, long recognised as a hazard of bed rest, came to public attention following the death of a young passenger. This "economy class syndrome" seemed associated with lengthy flights, alcohol consumption and failure to move about the cabin.

Professionally inspired and governmentally encouraged, a National Service Framework for Coronary Heart Disease was introduced in 2000. The framework outlined current good practice in hospitals. Specialist smoking cessation clinics, rapid access to chest pain clinics, rapid thrombolytic treatment, shorter delays for assessment and treatment, more effective use of aspirin, beta blockers and statins after a heart attack, and more coronary artery surgery were required. Guidelines for preventing cardiovascular disease, for example by blood pressure reduction, were further refined. Ambulance services, under pressure, had difficulty in reaching people as fast as necessary. Because of the survival advantage of rapid defibrillation, automatic defibrillators safe in the hands of lay people began to make their appearance in public places, for example in aircraft; in 2002 they went on sale in the USA, with FDA approval, for about $2,500. The British Heart Foundation and the Department of Health also began to install them in high risk locations.

Newer imaging techniques, such as stress echo testing, MRI, and ultra fast CT scanning for coronary calcium, brought benefits. Drug treatment of heart disease became more precise. The prognosis of congestive heart failure was improved by the use of ACE inhibitors. A new class of drug, the vasopeptidase inhibitors, was shown to be at least as effective as, if not more effective than, the existing ACE inhibitors in the treatment of high blood pressure, cardiovascular disease and ischaemic heart disease. It was recognised that unstable and acute coronary disease could be due to a vulnerable plaque, which might not tightly narrow a coronary artery but could be affected by various risk factors. There was often dramatic benefit from the use of anti-platelet and thrombolytic agents in the treatment of coronary disease, and major benefits from lipid lowering drugs on death from coronary disease. It became recognised that some drugs used to treat disturbances of heart rhythm could themselves have dangerous effects. However the automatic cardioverter defibrillator could be life-saving in preventing sudden death, and the treatment of disorders of heart rhythm, particularly in younger patients, was improved substantially by the introduction of catheter ablation, in effect the accurate destruction, through a catheter within the heart, of an area of the heart responsible for the disturbance of rhythm.

Coronary artery surgery continued its development; two main interventions were available for opening up blocked coronary arteries: balloon angioplasty and open heart surgery. Percutaneous coronary angioplasty usually requires one or two days in hospital, and patients can expect to be back at work within a week. Coronary artery bypass grafting (CABG) is more invasive and requires lengthy rehabilitation. In angioplasty a wide lumen catheter is fed from the groin up to the aortic root and into the coronary arteries. A guide wire is passed through the catheter and across the stenosis in the coronary artery. The wire guides a balloon (with a stent mounted on it if necessary) into the diseased section of the artery. The balloon is inflated, pushing the atheroma outwards and enlarging the lumen of the artery. Once the stent is in place (confirmed by angiography), the wires and the catheter are removed. Arterial restenosis remains a problem after percutaneous coronary angioplasty, tending to occur within three months, due to proliferation of smooth muscle as a reaction to vessel injury. Percutaneous transluminal coronary angioplasty (PTCA) and CABG are both used in the management of multivessel coronary disease, and rates of subsequent death or myocardial infarction are similar after either strategy. However, repeat revascularisation is more often required after PTCA than after CABG, though this risk can be reduced by the use of coronary stents as adjuncts. Restenosis used to occur in over 30% of patients, but with the use of stents, advances in stent design and improved techniques for implanting them, the rates now lie between 10% and 20%. This is comparable to the 10% of vein grafts that are lost in the year after bypass grafting. Newer stents, expensive and not yet in general use, are made of metal coated with a cytostatic agent such as sirolimus or paclitaxel. These agents are released slowly and locally to reduce proliferation of smooth muscle. Early trials with these stents suggest that less than 5% of people will have arterial restenosis. The Stent or Surgery (SoS) trial (2002) was designed to assess the effect of stent-assisted percutaneous coronary intervention (PCI) by comparison with CABG in the management of patients with multivessel disease. The results showed that coronary stents did reduce the need for additional revascularisation procedures, though the rate was still higher than in the CABG group.

Angioplasty during an acute heart attack appeared to have better outcomes than thrombolytic therapy. Minimally invasive operations were introduced with smaller incisions or there might be no cardiopulmonary bypass, the surgeon performing the anastomoses on a beating heart by the use of a platform or stabilising system.

In September 2000 Oxford announced the insertion into the heart of a small electrically powered pump, the Jarvik 2000, which successfully maintained blood flow. In spite of clinical advances, however, the length of waiting lists meant that some who would have benefited from surgery had died before they could be admitted. To increase surgical capacity it was suggested that nurses should be trained to undertake a substantial part of the operative procedures.

Organ transplantation

The results from organ transplantation steadily improved with advances in immunosuppressive therapy. After tacrolimus, mycophenolate mofetil (MMF) was marketed to reduce the risk of acute rejection, and monoclonal antibodies were also increasingly used. The survival rate for renal transplantation was now 86% at one year and 76% at five years. The new drugs might be expensive, but they could reduce the risk of rejection, the resultant morbidity, and the additional costs of further treatment. However, 2,500 people in the UK went into end-stage renal failure, far outstripping the supply of organs, so waiting lists for transplants steadily rose. For other organs the dearth of donors was even greater. As a result of the increased use of seat belts, and better treatment of subarachnoid haemorrhage and stroke, the number of organs available for transplantation fell throughout the decade. Most potential donors were to be found in critical care units, and there were proportionately fewer of these than in many other European nations. In 1999 the BMA voted in favour of an 'opt-out' system, so that the organs of those dying would automatically be available unless there was a written statement to the contrary. Transplant surgeons, however, preferred the existing 'opt-in' approach, fearing public reaction to a more radical policy. The Alder Hey report on retained organs (2001) appeared to reduce the number of those prepared to consent to organ donation (even, for example, for corneal replacement). Alan Milburn (Secretary of State), who was held partly to blame for having over-dramatised the problems at Alder Hey, set a five-year target for an increase in donated organs, without there being any sign of improvement.

Since 1995 there had been a significant fall in the number of patients receiving new hearts, lungs, or heart and lungs through transplantation, the direct result of fewer suitable organs owing to improvements in road safety and the treatment of trauma. In 2000, 265 patients in the UK were treated, and there were six heart and lung transplant centres in England (Birmingham, Cambridge, London, Manchester, Newcastle and Sheffield). The need to ensure that centres undertook enough operations to maintain expertise was appreciated, and a National Specialist Commissioning Advisory Group took on responsibility for commissioning heart and lung transplants for both adults and children.

The techniques necessary for the re-attachment of limbs had already been developed, and in 1998 the first hand transplant was carried out in France; a second, more successful, operation was undertaken the following year in the USA.

Stem cell transplantation was revolutionising the outcome of a range of malignant and non-malignant blood disorders, including immunological diseases. Stem cells could be obtained from blood, bone marrow and umbilical cord blood. Cord blood banks were established in London, Bristol, Belfast and Newcastle to collect, preserve and type blood products, and to test for viral contamination.

Neurology & neurosurgery

Neurological disorders accounted for 10-20% of hospital admissions, stroke being one of the commonest causes. Most were treated by general practitioners or general physicians, rather than by neurologists with easy access to precise imaging systems and vascular surgery. Only 35 hospitals had a neurological centre or a combined neurology/neurosurgery centre, and some 200 district general hospitals had no 24-hour on-call neurological service, usually relying on visiting neurologists from elsewhere. Improvements in treatment were therefore relevant to many doctors without specialist training. For example, the outlook of patients with stroke could be improved by very early thrombolytic therapy if intracranial bleeding could be ruled out.

Many new drugs were being introduced for epilepsy, not all of which appeared to be better tolerated or more effective. For temporal lobe epilepsy, surgery emerged as the most effective treatment. Costly drugs that had only a slight effect on diseases otherwise difficult to treat were a particular problem for neurologists, for example the interferons in multiple sclerosis and drugs for dementia. Trials were under way in motor neurone disease.

Some rare neurological diseases had long been known to have a genetic cause; now gene mutations were discovered that increased the risk of developing a common one, Alzheimer's disease. As in other fields of medicine, there was hope that the identification of causal factors at a molecular level would open the way to forms of treatment that would influence the course of the disease. As brain cells did not appear to regenerate, it would be important to develop an early diagnostic system, so that treatment, when available, could begin as soon as possible. The newer systems of imaging, for example positron emission tomography (PET), sometimes gave indications of how a neurological disease was developing, and how brain function was affected.

Advances in neurosurgery continued to be driven by technology. Frameless stereotaxy linked information from CAT or MRI scans, through computer systems, to sensors that could 'know' where the skull was. Magnetic resonance imaging could be used to display blood vessels, including the carotid arteries (MRA). Stereotaxic systems helped the surgeon to navigate safely through high-risk areas of the skull and brain, knowing exactly where the surgical instruments were. Interventional magnetic resonance imaging provided another possibility, with enough space within the scanner for the patient to move and for some neurosurgical procedures to be carried out.

Ophthalmology

Progress in ophthalmology was slow but steady. Artificial lens implants had revolutionised cataract surgery, and foldable intraocular lenses could be inserted through a small self-sealing incision. Better drugs became available for the local treatment of glaucoma, and endoscopic laser techniques also helped in its treatment.

Cancer

World-wide, the occurrence of cancer steadily increased: populations were on average older, and other causes of mortality were being attacked. Tobacco-related cancer was increasingly common in the developing countries as cigarettes were vigorously marketed. Many cancers were related to diet, but precisely to what dietary habits in specific countries was unclear. Infections, for example hepatitis B and Helicobacter, were also responsible. Genetic factors were being discovered, for the risk of cancer was greater among family members of people with cancer. Some families had a very high incidence of particular cancers, and specific gene faults could be identified. The human genome project increasingly provided new evidence on causation. A small increase in the risk of leukaemia in children appeared to be associated with proximity to high-voltage pylons.

Survival from cancer was lower in the UK than in most European countries, for example France and Germany, and than in the USA. Access to diagnostic services and staffing levels were poorer; there were fewer oncologists, and 40% of cancer patients never saw one. While a consultant at the Royal Marsden wrote to The Times about the huge advances made in cancer care in the NHS over the previous 30 years, colleagues elsewhere replied that attempts to achieve what was accomplished at the comparatively well-resourced Marsden were viewed askance by NHS management, who asked why so many scans were being performed and expensive drugs being prescribed (January 2002). Government, through the NHS Plan of July 2000 and a National Service Framework for cancer services, aimed to improve matters such as staffing levels and service organisation. Nine cancer networks, covering 15 million people, aimed to improve the experience and outcome of patients with suspected or diagnosed cancer. Stimulated in part by the length of waiting lists, government also pledged that patients with suspected cancer would be seen within two weeks of referral by a GP. However a report from the National Confidential Enquiry into Perioperative Deaths (November 2001) found that standards of care were variable and too few patients, particularly those admitted as an emergency, saw a cancer specialist. Emergency admissions were older and sicker, were often treated by general surgeons, and there might not be enough information on how far the tumour had spread, essential for the planning of further treatment. An almost simultaneous report on NHS Cancer Care in England and Wales from CHI and the Audit Commission showed that the network pattern for cancer services laid down in 1995 by Calman and Hine was in many places far from implementation.

There was some slow improvement, statistics showing that survival rates for people with cancer were improving substantially, particularly for cancer of the breast, colorectal cancer, non-Hodgkin's lymphoma and the leukaemias. For men in early middle age, a paper of which Professor Sir Richard Doll was a co-author showed that the prevalence of smoking had halved between 1950 and 1990, and the death rate for lung cancer at ages 35-54 had fallen even more rapidly. However women and older men who were still current smokers in 1990 had higher rates than in 1950. In 2000, the deaths of women from cancer of the lung exceeded those from cancer of the breast for the first time. Though the British survival rate for breast cancer was poor in European terms, deaths in England and Wales fell 21 per cent between 1990 and 1998 because of better treatment, including the use of tamoxifen (a 15% reduction), and (although this was in contention) the national screening programme for older women. The Institute of Cancer Research believed that there would be a further decline in deaths because of the programme, but a Cochrane review (2001) claimed that there was no reliable evidence to support the value of mammography screening in reducing deaths from breast cancer, and that it simply increased rates of breast surgery (BMJ 2001;323:956, 27 October).

The trend to conservation surgery continued, driven by technological improvements, clinical trials and patient preference, although in the case of women with genetic mutations predisposing to breast cancer (BRCA1 and BRCA2) prophylactic mastectomy increased survival rates, and tamoxifen might also have a protective effect. The case for screening for colorectal cancer grew stronger. Better imaging improved radiotherapy. The major problem in cancer treatment remained undetected spread at the time of first treatment. While there was an explosion of information about the molecular biology of cancer, the dramatic successes that had been achieved in some rarer cancers were not repeated in the commoner ones: breast, lung or colon. Chemotherapy and gene therapy remained the hopes for the future. Controlled trials improved the results of drug treatment as new agents were introduced, as for example in cancer of the breast (anastrozole and letrozole) and colorectal cancer. Some of the newer drugs were used mainly to extend the survival of people with terminal cancer, although when their effectiveness became apparent they might be used earlier, as in the case of paclitaxel (Taxol) in ovarian cancer. Taxol, one of the earlier drugs to be assessed by NICE, was recommended as standard initial therapy. Later, in 2002, trastuzumab (Herceptin) was accepted for advanced breast cancer, as it targeted a protein on the surface of fast-growing cancer cells, as were oxaliplatin (trade name Eloxatin) and irinotecan (trade name Campto) for metastatic colorectal cancer. Three drugs for non-small cell lung cancer were also approved: gemcitabine, paclitaxel and vinorelbine.

Monoclonal antibodies, after many years, began to live up to some of the expectations. There was rituximab (Mabthera) for low-grade non-Hodgkin's lymphoma; the antibody attached to a B-cell surface receptor present in most cases. Another was trastuzumab (Herceptin), active in some breast cancers. There were many more monoclonals in the pipeline, each active against a receptor on a malignant cell surface. It was a growth area with high costs. Another group of drugs active against cancer were the anti-angiogenesis agents, which prevented tumours from developing a good blood supply. At a research level many more drugs were being tested than ever before, and cancer treatment was becoming truly exciting.

The development of a vaccine against infection with human papillomavirus type 16 (HPV-16), which though often benign could progress to cervical and anogenital cancer, opened the possibility of preventing a substantial number of cases.

Radiotherapy centres were increasingly well equipped. The purchase of linear accelerators meant that many had "multi-leaf" collimators for the first time, helping to reduce the volume irradiated and so sparing normal tissue around the cancer. A new advance, continuous hyperfractionated accelerated radiotherapy (CHART), was introduced: radiotherapy was given for 12 successive days, including weekends, three times each day, with a 6-hour gap between treatments. The dose delivered each day was higher, and there might be more side effects, but in lung cancer treatment the "cure" rate seemed better.

The diagnosis of cancer of the breast was an area in which errors were regularly made. Skin-surface electrical testing provided a new non-invasive approach: research workers found a highly significant trend of progressive electrical changes according to the proliferative characteristics of biopsied breast tissue. Trials of the method suggested it had a high degree of accuracy and could be repeated regularly without discomfort to the woman.

Paediatrics

The quality of services for children, both in hospital and in the community, had been a regular concern; there had been, for example, the Court Report in the seventies and, before that, concern about the care of children in hospital. In the wake of the NHS Plan a Children's Task Force was established in 2000, and a 'child health czar' was appointed the following year to improve the 'fragmented and poorly coordinated' services that children might receive.

An unexpected problem was the emergence in England, as in the US, of increasing obesity in childhood, particularly after the age of nine. Lack of exercise, sedentary pastimes such as computer games, and fast food seemed responsible. As a result, in Britain as in the US, doctors increasingly saw illnesses more characteristic of adults, such as type II diabetes and even heart disease. The incidence of autism was also rising, and plans were made to achieve earlier diagnosis through a national screening campaign in infancy.

Clinical genetics was increasingly applied to diagnosis and treatment. The ability to screen embryos in vitro for inherited Fanconi anaemia enabled a clinical tour de force: the selection for implantation of one among a number of embryos that was not only unaffected by the disease, but led to the delivery of a baby whose cells could be used to treat an older, sick sibling.

The increasing ability to diagnose fetal defects by antenatal ultrasound scans, and the emergence of feto-maternal medicine as a specialty, meant more work for paediatric surgeons. Spina bifida and kidney and bladder diseases might be diagnosed before birth. Surgery immediately after birth could be planned, and where the defect might result in the death of the fetus or neonate, fetal surgery was sometimes possible. It became an international endeavour, with nearly a dozen centres worldwide. Until recently only fetuses with life-threatening defects had been considered candidates for prenatal correction; now fetal surgical procedures were being performed for non-lethal conditions as well. Despite the expansion of fetal surgery, ethical considerations remained. The guiding principles were documentation of the natural history of the untreated disease in utero, sound pathophysiological reasons for treatment before birth, demonstration of the safety and efficacy of the fetal procedure in animals, and the development of criteria for treatment. Rigorous groundwork had been accomplished for several anomalies amenable to fetal surgical intervention, for example severe congenital diaphragmatic hernia. Minimally invasive techniques were often preferable to an attempt to replicate the procedure that would be undertaken postnatally.

Organ transplantation in children presented difficulties not found to the same extent in adults; for example, drugs used to suppress immune reactions might affect growth. Nevertheless advances in medical knowledge and surgical technique extended the range of indications and improved both survival and the quality of life. The outlook of children dying of liver, kidney or heart failure was revolutionised. It became possible to transplant kidneys at younger ages with a good chance of success. Liver transplantation was extended to the neonatal age group. Congenital heart disease and cardiomyopathy could be treated by heart transplantation. Paediatric transplantation was a victim of its own success: improved survival led to increased referrals. In 1997, 350 children under the age of 18 underwent transplantation, 221 were on the waiting lists, and 28 had died while waiting for surgery.

Obstetrics & Gynaecology

Despite decades of accumulated observational evidence, the balance of risks and benefits for hormone replacement therapy (HRT) in healthy postmenopausal women remained uncertain. A US trial of a combined oestrogen/progestogen preparation in healthy postmenopausal women was stopped in May 2002, after an average follow-up of 5.2 years, because the risks (for example invasive breast cancer, heart disease and stroke) exceeded the benefits, such as a reduction in fractures (JAMA 2002;288:321-333). Over 2 million women were reported to be taking HRT in Britain, and an MRC trial of long-duration oestrogen use after the menopause was under way; in November 2002 it was decided to bring this trial to an end as well.

Obstetricians faced rising demands for care involving issues of fertility, and ethical issues abounded. From January 2001 the "morning after" pill became available over the pharmacist's counter, rather than only through medical channels.

There was an ever-increasing demand for fertility treatment, often from women who had delayed pregnancy until their mid-thirties. By 1998 more than 25,000 women a year were having IVF or intra-cytoplasmic sperm injection. NHS resources being limited, the majority of this treatment was undertaken in the private sector, at great cost to the individuals concerned. To increase the success rate more than one embryo was often implanted, and the number of multiple births to women aged 35 or more increased rapidly. As these deliveries were usually undertaken within the NHS, and many of the babies were of low birth weight and delivered early, the consequential costs to the NHS were considerable.

New technologies allowed a measure of sex selection of babies, at a price in the private sector but increasingly also within the NHS. The Human Fertilisation and Embryology Authority (HFEA) decided to allow selection of an embryo so that a baby could become a donor for a brother seriously ill with the inherited blood disease thalassaemia. Surrogacy might be used not for reasons of infertility but for social convenience. International travel made it possible for those with the money to evade national legal controls.

Improvements in imaging, endoscopy and drug treatment all contributed to steady advance in gynaecology. Fibreoptic endoscopes enabled the replacement of some major operations by minimally invasive procedures. Ectopic pregnancy could be diagnosed early by ultrasound and treated by laparoscopic surgery. New approaches to the treatment of heavy menstrual loss were developed, for example endoscopic ablation of the endometrium by laser, and simpler methods of treating endometrial polyps were also possible. Better understanding of the risk and frequency of incontinence after delivery, and of techniques for the repair of tears, offered more hope to women suffering such embarrassment.

Concern about injuries to the pelvic floor was among the reasons for rising Caesarean section rates. While as a surgical procedure it carried its own risks, neither doctors nor patients resisted the pressures sometimes present to do everything possible to ensure a good outcome for the baby. To try to produce good evidence about the incidence and advantages of section, the Royal College of Obstetricians and Gynaecologists undertook a large-scale survey; the section rate was 22%, and many obstetricians believed this to be too high.

Problems with reading cervical cytology slides emerged from time to time. Even in units recognised to be of high quality, such as Leicester, the inherent difficulty of reading the slides meant that audits showed repeated failures to identify abnormal smears.

Geriatrics

The media had drawn attention in the sixties to the poor standards of care the elderly received. Although the elderly were now usually admitted to acute wards, the care they received was still of questionable quality. Stimulated by a series of articles in the Observer in September-October 1997 that raised issues about poor basic nursing care and lack of equipment, the Health Advisory Service (HAS) established a project to examine the care of elderly people in acute wards in 16 general hospitals throughout England. The fabric and design of the wards were often poor, equipment might be lacking, ward routine might be inflexible and patients might not be helped to eat or drink. Long delays for emergency admission, poor quality food and problems with privacy and dignity were also identified. On balance the HAS preferred specialised facilities for the elderly, rather than facilities integrated with those for other patients. It asked for national standards: that older people should be helped to eat and drink, should lie in a clean, dry bed, and should be treated with respect. Ministers agreed that such standards should be introduced over the next few years. Yet in March 2000 the new Commission for Health Improvement had, as one of its first tasks, to act to improve care in Cumbria, where the North Lakeland Trust had been shown to fall far short of acceptable standards in its care of the elderly. In March 2001 the Standing Nursing and Midwifery Advisory Committee's report, "Caring for Older People: A Nursing Priority", was published, motivated by mounting evidence that older people did not always have access to acute care that met all their needs. The report, which contained many thoughtful conclusions and recommendations, said

"There is a great deal of evidence to support the conclusion that the care that older people receive often fails to meet their most basic needs for food, fluid, rest, activity and elimination and the psychological and mental health needs of older people are often entirely neglected in acute health care settings. The nursing care of older patients is mainly deficient in terms of fundamental skills, such as communication and helping a patient to maintain their nutritional status, skin integrity and continence."

Simultaneously the Department of Health published a new National Service Framework covering "age discrimination", person-centred care, intermediate care, general hospital care, specialised stroke services, falls, mental health and the promotion of an active, healthy life. Once more there were promises of improvements, patient "champions" and future investment in ward upgrading.

Labour, before their general election victory in 1997, had accused the Conservative Government of forcing thousands of pensioners to sell their homes to pay for long-term care. The Royal Commission on Long-Term Care for the Elderly (1999) proposed making all nursing and personal care, including help with washing and dressing, free to all who were assessed as needing it. The Government was cool towards a solution with such immense costs, and its response was delayed for over a year, until July 2000, when it was published along with the NHS Plan. Personal care would not be free in England, but the devolved Scottish Parliament decided that it would be, opening up a division in health care provision within the UK. Nursing care, on the other hand, defined as anything that a registered nurse provided or supervised, would be free, and the value of the patient's home would not be taken into account for the first three months. A compromise had to be made between idealistic aspirations, totally free medical and social care for the elderly, and what could be afforded within a budget inevitably limited by public finance. Legislation in the Health and Social Care Act (2001) made it possible to charge for "intermediate care" after the first six weeks.

In March 2001 the Government published a new National Service Framework for the care of the elderly, and Professor Ian Philp became National Director of Older People's Services (the older people's Tsar). The framework aimed to root out age discrimination in the NHS, provide person-centred care, provide intermediate care to prevent unnecessary hospital admissions and to ensure timely discharge from hospital, ensure older people received the specialist help they needed in hospital, reduce the incidence of stroke, reduce the number of falls, promote good mental health and promote health and an active life in older age. Professor Philp subsequently found little evidence that care on acute wards improved following the framework's publication.

To geriatricians the most worrying feature was its proposals for developments in intermediate care. An extra 5,000 intermediate care beds were to be created. Geriatricians with long memories recalled that in the 1960s there were many intermediate care beds outside acute hospitals, into which "bed blocking" old people were transferred in the hope that somehow they would disappear from the system. Geriatricians then spent their lives getting such beds closed and their staff resources transferred to acute hospitals to provide the specialist rehabilitative care that older people needed to get safely and expeditiously home. Specialist geriatric rehabilitation units were crucial elements of comprehensive acute hospital services but were expensive. It might prove convenient for managers to confuse convalescence (spontaneous recovery) with the more expensive rehabilitation necessary to make non-spontaneous recovery happen. Those geriatricians who defended specialist rehabilitation units might now have to fight to prevent their being downgraded to intermediate care. Indeed, managers might seek to close rehabilitation units to free money for purchasing intermediate care beds in private sector nursing homes. There was also a risk that older patients could be sent directly to intermediate care, bypassing the skilled diagnostic evaluation that the complexities of disease and disability in old age required.

Though resources might be short within the hospital service, the problem was at least as great for the social services that were often responsible for the costs after discharge. A King's Fund report published in June 2001 stated that the sector was under-resourced. As social service departments paid for residential and nursing home care for people discharged from hospital, their financial problems affected the NHS. Private residential and nursing homes began to close because payments did not cover the cost of providing a staff-intensive service and of meeting (quite properly) newly introduced standards of accommodation. The Fund believed that without substantially more expenditure on social care it would not be possible to implement policies for the better care of the elderly. More and more hospital trusts reported that it was difficult to admit emergencies because it was difficult to discharge older patients requiring care; 10-50 beds might be blocked in this way in an acute hospital. Government therefore put additional money into the support of residential and nursing homes, and in 2002 suggested that local authorities might be made to bear the cost of patients in hospital who were waiting for a local authority social services placement.

Alzheimer's disease was among the conditions in which genetic medicine showed promise. Single and multiple genes were identified that led to the production of amyloid protein within the brain. Aberrant processing of amyloid precursor protein, leading to increased production and aggregation of amyloid peptide in the brain, seemed central to the pathogenesis of Alzheimer's disease. This in turn suggested types of treatment that might in future be developed. Three drugs, cholinesterase inhibitors, for treating mild or moderate Alzheimer's disease were approved for use in the NHS by NICE in January 2001.

Mental illness

Organisational change had a substantial impact on psychiatry, and on community care in particular. Changing organisations and changing boundaries, for example of the Mental Health Trusts, made the development of team work with social work services difficult. As patients were discharged into the community, some of the old mental hospitals such as Friern were converted into high-quality living accommodation for the well-to-do; developers appreciated the amazing assets these former institutions possessed. Yet care in the community as a policy had often failed. SANE, a voluntary mental health charity primarily concerned with schizophrenia, believed that while it worked for some people, it let down many others. The social experiment meant true liberation for many people who, with the new drugs, were well able to live outside hospital, but for others it had meant fighting for mental and physical survival alone in flats and bedsits, or with families who broke down under the strain. There was a severe shortage of community psychiatric nurses, and there needed to be enough hospital places, places of asylum in the true sense of the word.

Government agreed that there had been too many failures of the policy and promised £1 billion to develop outreach schemes, 24-hour nursed beds and other initiatives, and a review of the Mental Health Act 1983 to give health authorities wider powers to order the supervision of patients outside hospital. Less attention was paid to the disparity between the staff available and the demand for psychiatric services. The Government's strategy, Modernising Mental Health Services, had two essential elements. First, there would be increased investment to provide more beds, outreach facilities, 24-hour access and new treatments. Second, there would be increased control of patients to ensure compliance with appropriate treatment in the community, and a new form of reviewable detention for those with a severe personality disorder. There would, as was the vogue, be an emphasis on service frameworks and performance monitoring.

In December 2000 a White Paper, Reforming the Mental Health Act, was published. It aimed to deal with public concern that the policy of closing mental illness hospitals, and care in the community, had led to the release of hundreds of patients, some of whom did not seek or receive care and treatment and became a risk to themselves and others. Such releases had contributed to 1,000 suicides and 40 murders a year, according to the Department of Health. There would be detention, if necessary indefinite, of those believed to be a danger to the public. Legislation would close many loopholes that prevented detention even where future problems appeared inevitable; for example, the criterion of "treatability" would be removed. Compulsory treatment would be possible, with safeguards for the patient, and the sharing of information between the police, the social services and the health services would be encouraged. A draft Bill along these lines was published in June 2002 and attracted immediate criticism over the proposals for compulsory detention and treatment of some people with so-called dangerous severe personality disorders.

As in other clinical fields, government sponsored a National Service Framework and proposed targets including a 20% reduction in suicides. A strategy to reduce suicides among high-risk groups, promote mental well-being in the general population and research suicide and its prevention was published in September 2002; measures included improving the prescribing of antidepressants and analgesics, and partnerships to identify and improve safety at suicide 'hot spots' such as railway bridges. It would be implemented by the National Institute for Mental Health in England (NIMHE), led by the NHS national director for mental health, Professor Louis Appleby, and part of the Modernisation Agency.

Drug treatment of psychiatric disease continued to improve with, for example, the introduction of new 'atypical' antipsychotics, such as amisulpride, olanzapine, quetiapine, risperidone and zotepine, which produced fewer extrapyramidal side effects than conventional drugs such as chlorpromazine. The new drugs were many times more expensive than the ones they replaced but seemed to work better, appearing to reduce the suicide rate. However, the extent to which street drugs such as crack cocaine were taken by patients with recognised psychiatric problems was an increasing hazard. The evidence for the cost-effectiveness of the newer antipsychotics was equivocal, but in June 2002, some years after their introduction, NICE approved their use both for patients with side effects from the traditional drugs and as a first treatment. There was also hope that new drugs in the pipeline would modify the disease, rather than merely treat the symptoms. Perhaps, caught early, schizophrenia would go into full remission, or the progression of Alzheimer's disease might be alleviated.