Chronology

1998

Background

Kyoto Protocol (December 1997)

Digital TV

FTSE reaches 6000

Human Rights Act

NHS events

Green Paper – A First Class Service

Bristol cardiac surgery scandal

Information for Health strategy 

NHS Direct 

Acheson Inquiry into Inequalities in Health

da Vinci robotic equipment (USA)

1999

Background

Introduction of the Euro

First elections for Scottish Parliament and Welsh Assembly

Irish power-sharing agreement

NATO action in Kosovo & Serbia 

Impeachment trial of President Clinton

Paddington train disaster

Disruption in East Timor

NHS events

National Institute for Clinical Excellence (NICE) established

Nurse shortage; substantial pay award.

Royal Commission on Long-Term Care of the Elderly

White Paper – Saving Lives: Our Healthier Nation

Abolition of fundholding

Establishment of Primary Care Groups

Alan Milburn Secretary of State

Changes in British Society

The decade opened and closed with Labour in power and the NHS in financial crisis, in spite of the greatest increase in expenditure the NHS had ever seen. The economy was sound for most of the decade. The UK, like many other countries, experienced terrorism, often fuelled by radical Islamic influences. The devastation in New York (9/11), atrocities in Spain and on the London Underground, and the Iraq war cast long shadows. Following the Kyoto Protocol in 1997, climate change and carbon emissions became an international issue, though without significant achievement. Globalisation, the pressures of the European Community and the digital revolution were driving change. The introduction of the Euro in 1999 led to debate on our place in Europe and on the European constitution. To bring Britain into line with the Community, ambulances changed colour from white to an eye-catching yellow.

Population movement increased. First London and then the whole country experienced an influx from the European Union. Tens of thousands of young French people came. Even before the EU expanded, many from Eastern Europe, and especially Poland, arrived, filling jobs that the indigenous population did not want. Over a decade, around a million people came. Local authorities complained of the pressure on their services. Retired English people moved to France and Spain for the quality of life. Emigration from the UK increased steadily, to nearly 200,000 in 2006. Public reaction to economic migration and asylum seekers changed the political landscape throughout Europe – the UK Independence Party (UKIP) had been founded in 1993. Some migrants came from areas with a high prevalence of AIDS, tuberculosis and hepatitis B. While Bevan had explicitly believed that the NHS should be available to everyone, resident or visitor, government now said that it should not be free of charge to those who did not live in the UK. Front-line staff had little time or inclination to ask patients too many questions.

How do we distinguish a visitor from anybody else? Are British citizens to carry means of identification everywhere to prove that they are not visitors? For if the sheep are to be separated from the goats both must be classified. What began as an attempt to keep the Health Service for ourselves would end by being a nuisance to everybody. Happily, this is one of those occasions when generosity and convenience march together.
(In Place of Fear, chapter 5, Bevan, 1952)

The World Health Organization’s 20-year plan to bring ‘health for all’ failed. More than 2 billion people had no basic sanitation. The European Region’s Health for All, equally ambitious, was in tatters.1 The campaign for the reduction of third world debt made only limited progress, and poverty, famine, wars and the AIDS crisis seemed worse day by day.

The north/south divide was increasing. The expectation of 24/7 commitment to one's employer caused stress for some. Crises hit agriculture: bovine spongiform encephalopathy (BSE) in cattle and foot and mouth disease in sheep. Our multi-ethnic society was increasingly apparent. Racially motivated riots (Oldham), protests against a global economy, and violence in the streets, sometimes black-on-black and sometimes against NHS staff, soured the atmosphere. The fashion for body-piercing and cropped tops changed the townscape, while public pressure led to the establishment of smoke-free public places. To the profit of pharmacies, a gullible public spent increasingly on ineffective 'alternative' medicines, while a split in the anti-vivisection movement led to terror tactics. For the young, adventure holidays and gap years proliferated, with rising use of recreational and synthetic drugs, and clubbing. Some died as a result. Institutional and financial malpractice, and threats to pension schemes and banks (Northern Rock), accompanied financial crisis.

In 1998, Labour devolved power to an elected Parliament in Scotland and elected assemblies in Wales and Northern Ireland. Four different health services emerged. In England the accent was on the purchaser/provider split, improving performance and setting targets; in Scotland, on a professionally led integrated system based on clinical networks; and in Wales, on partnership between the NHS and local authorities. In both Scotland and Wales there were benefits – in care for the elderly, drug availability and prescription charges – that were not available in England. The differences in funding under the Barnett formula were apparent.

Public spending, 2007–2008

                      % of GDP    Total expenditure per head
England               41.1        £7,121
Scotland              50.3        £8,623
Wales                 57.4        £8,139
Northern Ireland      62.7        £9,385

Source: Sunday Times, 9 March 2008

Towards a new model of NHS

In no previous decade had such a succession of Ministers, new policies, White Papers and restructurings hit the NHS. It seemed that, great though clinical advances were, the NHS was overshadowed by structural change, hospital scandals and an increasing desire to legislate and regulate deep-seated problems away. With ever-increasing speed, the pieces on the NHS chess board were moved around. Health advisors in No 10, economists and operational research staff now played a role in shaping policies, largely accepting Virginia Bottomley's concept of a tax-funded and largely free service, but one in which provision was not necessarily in the public sector. A raft of policies emerged, not always compatible and seldom evidence-based: private sector involvement, quality, peer review, central direction, performance reporting, accountability, competition, trusts, patient choice, and payment by results.2 There was bipartisan support for many policies, such as the National Institute for Clinical Excellence (NICE), a purchaser/provider split, Foundation Trusts, concentration on long-term illnesses, patient choice, involving primary care in commissioning, a tariff system to pay providers and a more personal service. Both parties looked at what could be learned from managed care organisations such as Kaiser Permanente. Kaiser's characteristics included integration of funding with provision of service; integration of inpatient care with outpatient care and health promotion; a focus on minimising hospital stays by emphasising prevention, early and swift intervention based on agreed protocols, and highly co-ordinated services outside the hospital; teaching patients how to care for themselves; an emphasis on skilled nursing; and the patients' ability to leave for another system if care was unsatisfactory. Kaiser did NOT have a purchaser/provider split.

In each decade there are concepts affecting the organisational pattern of the NHS. In the 1970s, it was consensus management. In the 1980s, the general management function. Now, spurred by scandals in the financial sector and industry, good governance became a guiding principle. In 1992, the Cadbury Report had identified principles of good governance in organisations – integrity, openness and accountability. This was taken further in the Nolan Report (1997) and absorbed into NHS management.

The private sector and the NHS

There was an increasing role for the private sector as the NHS moved from a service provider to a commissioning organisation. This was opposed by Frank Dobson (at one time Secretary of State), substantial parts of the Labour Party, the unions, NHS management and, sometimes, the medical profession. Previously used by the NHS as a pressure release valve, the private sector was becoming integral to all segments of the NHS. Commercial organisations tendered for and supplied family practitioner services, hospital trusts increasingly contracted out services, and patients might have a choice of a private hospital. The private finance initiative funded hospital building, and privately managed independent treatment centres handled NHS patients. DHL took over the supply and transport of hospital supplies. Perhaps it was not surprising that the NHS decided to brand itself: in 1999, to imply focus and consistency of service, Frank Dobson told the NHS to adopt a single logo.

Yet the UK spent less than almost any other Western country on private health care. The number of people in the UK with private medical insurance had remained static for several years but increased again in 2000 to 5 million, about 12.6% of the population. More were insured in the south than the north, and some used fixed-cost 'pay-as-you-go' packages. Cataract removal for £2,000, knee replacement for £7,000 or a heart bypass for £10,000 might be a practical proposition.

Ethics and patient participation

Ethical problems abounded, particularly in genetic medicine and in-vitro fertilisation. Parliament considered issues such as the creation of 'rescue babies' whose stem cells could help a sibling. The General Medical Council (GMC) issued advice on consent and ethical problems; doctors must set aside their religious and other personal beliefs if these compromised the care of patients. Community Health Councils (CHCs) were replaced by the Commission for Patient and Public Involvement in Health in 2003, itself abolished in 2008 and replaced by Local Involvement Networks. Foundation Trusts provided their members with other opportunities to participate.

Health service information systems and Internet

It was the decade of Google, Yahoo, Facebook, YouTube, Wikipedia, Blogs and Amazon. Apple gained market share; Microsoft stumbled. In 1998, 6 million in the UK had access to the web at home or at work and, within the decade, the majority had broadband access. The clinical knowledge on the web was so vast that doctors might find useful suggestions by using Google.

The US Government, the Mayo Clinic and Kaiser Permanente were early in the field. The UK was initially cautious but, by 2000, the NHS, the Department of Health and the British Medical Association had effective websites and increasingly used them to publish their documents and reports. The NHS website was re-launched as NHS Choices in June 2007 to provide patients, carers and the public with accurate and up-to-date health information.

Education benefited. The National Electronic Library for Health, a resource primarily for professionals, was followed by the National Library for Health. In 1998 the British Medical Journal (BMJ) became an open access journal, making the full text freely available. In 2003, the BMJ Publishing Group provided access to the evidence-based summaries available in Clinical Evidence. Journals increasingly offered online editions, sometimes free, and Stanford University's HighWire Press hosted several hundred electronic versions of scientific journals and provided a search system. The US National Library of Medicine's free digital archive of biomedical and life sciences journal literature – PubMed Central (PMC) – aimed to digitise a complete archive of medical journals, including the BMJ, some going back more than 125 years.

An effective NHS information system centred on clinical need was at last under development, and appropriate technology was becoming available. In 2002 the Audit Commission stressed the importance of accuracy in Data Remember: Improving the quality of patient-based information in the NHS.3 The assessment of the quality of care, contracts that required information about who had done what for whom, and new services such as pharmacist prescribing and walk-in centres all made a coherent IT system essential. An NHS Information Authority was established to manage the development of systems; it oversaw the introduction of the NHS number, new numbers for babies, payments for GPs and national screening programmes. The strategy for NHS IT dated back to 1992, but recurrent problems reduced support for the programme. In 1998, a new strategy, Information for Health, created fresh momentum and shifted the emphasis from the administrative to the clinical.4 It committed the NHS to provide lifelong electronic health records for everyone, with round-the-clock online access to patient records and information about best clinical practice for all NHS clinicians. Every GP would be connected by 2000 – targets that were missed.

Further impetus followed a seminar in Downing Street in February 2002 and the Wanless Report in April 2002, which criticised NHS IT as "piecemeal and poorly integrated". In July 2002, Delivering 21st century IT support for the NHS was published.5 An unprecedented investment began, some £18 billion over ten years. It was the world's largest and most ambitious health IT programme, aiming to create comprehensive electronic health records that would be available to all providers.

Richard Granger was appointed National Director in 2002. Contracts stressed speed, competition and payment to contractors only if they delivered. In 2004, the Department of Health created a new body, NHS Connecting for Health, to deliver the programme. The programme was handled in a top-down fashion and organised in two parts: a national spine, and local service providers covering five regional clusters. The habit of placing very large, often outsourced contracts in retrospect hindered innovation and flexibility as circumstances changed, including the pattern of NHS organisation. In 2004, BT was awarded the contract to provide the national infrastructure and to be one of four Local Service Providers (LSPs); three other firms won contracts to provide services in the remaining clusters. LSPs were responsible for local IT systems, such as GP and trust systems, and for making sure local applications could share information with the national systems.

National contracts were also awarded to BT for the NHS Care Records Service, to Atos Origin (formerly SchlumbergerSema) for Choose and Book, and to BT for the New National Network. BT would act as a systems integrator, and BT Syntegra provided the system for information and payments under the Quality and Outcomes Framework. The system involved the following (a conceptual sketch of the national/local record split follows the list):

  1. A national network for the NHS – fast connections to all NHS sites – and an electronic referral service, 'Choose and Book', so that patients could choose which hospital they would like to attend, at a time to suit them. The programme fell two years behind, but by 2007/08 the system was increasingly reliable.
  2. The NHS Care Records Service (NHS CRS), containing basic patient information and health details, so that people would be able to access their record and all their health information and be involved in making decisions about their own care and treatment. As a spin-off, data for research would be available. This fell grossly behind schedule.
  3. Electronic prescription service (EPS) – to enable prescriptions to be sent from the GP to the dispenser and then for reimbursement.
  4. Picture archiving and communications system (PACS) – an unexpected and rapid success that allowed X-rays and scans to be stored and transmitted electronically, completed in 2007.
  5. The National Electronic Library for Health (later Health Information Resources) designed with the NHS employee, doctor or nurse in mind.
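
The division the programme envisaged – a thin national 'spine' holding basic demographics keyed by the NHS number, with fuller records held in local GP and trust systems – can be pictured with a minimal sketch. This is purely illustrative: the field names and the join function below are assumptions made for the example, not the actual NHS Care Records Service design.

```python
from dataclasses import dataclass, field

@dataclass
class SpineRecord:
    """Illustrative national-spine entry: the thin data the spine held
    (NHS number, name, address)."""
    nhs_number: str
    name: str
    address: str

@dataclass
class LocalRecord:
    """Illustrative detailed record held by a local (GP or trust) system,
    keyed back to the spine by the shared NHS number."""
    nhs_number: str
    provider: str                      # e.g. a GP practice or hospital trust
    clinical_notes: list = field(default_factory=list)

def join_local_to_spine(spine, local):
    """Match local records to spine entries on the NHS number - the basic idea
    behind local applications sharing information with the national systems."""
    index = {s.nhs_number: s for s in spine}
    return [(index.get(r.nhs_number), r) for r in local]
```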

The programme limped along and many hospitals had to upgrade ageing systems. GPs could continue to choose one of a range of existing systems as a national standard one was not available. Suppliers faced major losses and gave up contracts. Public anxiety about the security of personal information was increased by a series of alleged security breaches. Richard Granger left the programme in 2007, by which time the NHS spine was in place, covering basic data, name, address and NHS number, but the summary medical record and its transmission between providers remained far off. A Department review found failures at the top; no one seemed to ‘own’ the big picture on information, there was no system to translate policy into business requirements, and a shifting of responsibility for IT around the Department.

However, by 1996, 96% of general practices were computerised and about 15% ran 'paperless' consultations. In hospitals, computing was treated as a management overhead and doctors had few incentives to become involved. For 20 or more years GPs had used PCs; hospitals needed larger machines. The differing structure of patient records from specialty to specialty, and security, all made for problems.6

In 1995/96 a new NHS number was issued to all patients on GPs' lists. These numbers were the basis of electronic patient medical records. By 2001, the system provided online access to over 60 million records.
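
As a technical aside, the ten-digit NHS number carries its own integrity check: the final digit is a Modulus 11 check digit computed from the first nine. The sketch below illustrates that published check-digit arithmetic; it is an illustration added here rather than something drawn from the text above.

```python
def is_valid_nhs_number(number: str) -> bool:
    """Validate a 10-digit NHS number using the standard Modulus 11 check digit.

    The first nine digits are weighted 10 down to 2 and summed; the remainder on
    division by 11 determines the expected check digit (a result of 11 maps to 0,
    and a remainder of 1 means no valid check digit exists).
    """
    digits = [c for c in number if c.isdigit()]
    if len(digits) != 10:
        return False
    total = sum(int(d) * w for d, w in zip(digits[:9], range(10, 1, -1)))
    check = 11 - (total % 11)
    if check == 11:
        check = 0
    if check == 10:
        return False   # numbers whose weighted sum leaves remainder 1 are never issued
    return check == int(digits[9])
```

For example, the digits 943 476 5919 satisfy the check (weighted sum 299, remainder 2, expected check digit 9).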

Health Service Policy

The decade saw an unparalleled level of change, organisational, clinical and financial. The ‘New Labour’ model of the internal market saw Primary Care Trusts (PCTs) selectively contracting with providers. There were several ‘reviews of the NHS’ and the succession of ‘reforms’, ‘modernisation’ and ‘reorganisation’ hardly bears repeating but included:

  1. dismantling GP fundholding and establishing practice-based commissioning
  2. abolishing district authorities and forming primary health care groups, then turning them into trusts, and later merging them into half the number
  3. replacing the regional offices and remaining health authorities with 28 strategic health authorities, and then merging these into ten authorities
  4. creation of Foundation Trusts
  5. the Darzi reforms.

Until the 1990s, the NHS tended to the paternalistic, with limited choice for patients. Public spending had been controlled firmly, NHS waiting lists had risen, and Kenneth Clarke and Alain Enthoven aimed for an internal market to improve allocation of resources. Purchasing and provision were separated and the aim was to give patients more choice of provider and the information to make that choice. Initially purchasers continued to enter into large bulk contracts, the accent being on activity rather than outcome. With temporary hesitations, these principles were adopted by Labour, which added a fourth element in 2003 – a better payment mechanism.

The complexity of these changes, often differing from place to place, makes for a messy story that is hard to present coherently. Finance was initially tight, then much more money became available, and as the decade ended worldwide financial crisis loomed. A plethora of policies, many individually sound, seemed to have been developed without regard to each other, did not always mesh together and produced unanticipated results. Each of the five Secretaries of State imposed his or her own approach (with the support of Downing Street – Tony Blair or Gordon Brown), so there were U-turns – for example, on private health care.

Frank Dobson (1997–1999). Apart from the elimination of fundholding, and the substitution of 'commissioning' for 'contracting', there was little change. Labour's ten-year agenda was set out in The New NHS – Modern, Dependable,7 and, during Dobson's watch, the National Institute for Clinical Excellence (NICE), a health inspectorate and national service frameworks were established.

Alan Milburn (1999–2003) arrived "as the wheels were coming off" as a result of a financial crisis. Labour had assumed that, if it reversed the Tory reforms and smothered hospitals with affection, all would be fine. It now discovered that many Conservative reforms had merit. Milburn and his successors developed and refined the Conservative reforms, with a new and sometimes disruptive dynamism and a desire for massive change across a broad front, epitomised by his NHS Plan (2000).8 Other key documents were Shifting the Balance of Power (2001),9 Delivering the NHS Plan (2002)10 and the Wanless Report.

There was central command and control, although with some attempt at devolution; small increases in funding were replaced by major additions projected over many years. Milburn recognised that the NHS was underfunded and obtained more money to expand staff training and recruitment, an increase of 250,000 over the next six years. There was a start of a major building programme of hospitals under the Private Finance Initiative. Radical changes in organisation and funding took place. The idea of Foundation Hospitals was born, and Labour sought partnership with private health care to create new capacity and to provide a challenge to complacency in the NHS. It looked to the experience of other countries such as the USA.

Simon Stevens, the health advisor to the prime minister from 2001–2004, and the intellectual force behind many of Labour’s reforms, wrote that the attempt to increase capacity, improve quality and increase responsiveness while avoiding cost inflation was based on three parallel strategies.

  1. Supporting providers by increasing their number, modernising infrastructure, supporting learning and the improvement of the system. (Capacity would be increased through staff recruitment, public-private partnership projects and new providers.)
  2. Improving efficiency and reducing variation in performance by setting standards (cost effectiveness, tariffs, contracts, National Service Frameworks, inspection, regulation, publishing performance information and direct intervention when necessary).
  3. Using market incentives for change and local accountability (for example, patient focus and choice, star ratings, reforming financial flows, competition and commissioning).

John Reid (2003–2005) was deeply committed to Bevan's vision of a national health service, but he eased the rapid pace set by Alan Milburn, focusing more narrowly on a few key things that might be delivered. At last there were improvements in waiting lists and staffing. Foundation Hospitals were a divisive issue, but Reid continued to develop them. GPs and consultants fought against new contracts and then accepted them, gaining far more money than the Department had predicted. Reid established a review of the many 'arm's-length bodies' and continued the 'modernisation' policies of his predecessor, placing increasing emphasis on patient choice. The new system of financial flows (Payment by Results) came into effect, and there was increasing emphasis on chronic diseases and long-term care. Key documents included A Patient-Led NHS: Delivering the NHS Improvement Plan.11

Patricia Hewitt (2005–2007) continued these policies with the progressive introduction of private sector services, moving to a more market-based approach. Her major initiative was the 2006 White Paper Our Health, Our Care, Our Say – Community Care,12 on shifting care from hospitals to community services.

Alan Johnson (2007–2009), a former union general secretary appointed by Gordon Brown, took a fresh look. As part of his Government of All the Talents ('GOATs'), Gordon Brown selected a high-profile surgeon, Professor Lord Darzi. Johnson was particularly committed to the reduction of health inequalities and published a review of progress and next steps in reducing them. Yet another review of the NHS explored "the causes of dissatisfaction among staff and patients". High Quality Care for All13 was published in June 2008.

The Incremental Policy Developments 1998–2007

The New NHS – Modern, Dependable

When Labour came to power in 1997, Frank Dobson (and his Minister Alan Milburn) found to their dismay that, while in opposition, the party had developed no health service policy ready for implementation. They were starting from scratch. In December, Labour issued The New NHS – Modern, Dependable,14 its initial vision, conceding that some features of the Conservatives' internal market were worth keeping while denouncing many others. The government wanted to get things done fast, without necessarily relying on local management bodies. Watchdogs, systems of audit, targets, and external and retrospective methods of control proliferated, as did 'zones', initiatives and 'tsars' with responsibility for improving specific services.

The New NHS built on: current trends; better communication within the service; GP out-of-hours services increasingly using nurses to assess emergency calls; NHS Direct – the new nurse-led help line; and an accent on quality with new national supervisory bodies. Labour established a National Institute for Clinical Excellence – now the National Institute for Health and Care Excellence (NICE) – to investigate and approve cost-effective pharmaceuticals and interventions for use in the NHS, and a Commission for Health Improvement (CHI) (later the Healthcare Commission) to see what was, in fact, happening.

The harder edges of the internal market were softened. Fundholding would go, co-operation replacing the more extreme forms of competition. 'Partnership' and 'integration' would replace the internal market. The jargon changed to that of New Labour; 'seamless services' became 'joined-up thinking'. National guidance stressed the interdependence of health and social care, a return to the attempts by Barbara Castle and David Owen in 1974 to integrate health and social services planning. The Health Act (1999) gave legislative authority for these changes and also provided a new basis for the professional self-regulation overseen by the General Medical Council.

Main features:

  • New services for patients including a 24-hour, nurse-led help line
  • Connect every hospital and surgery to NHSnet  
  • NICE and Commission for Health Improvement to issue guidelines and oversee clinical quality locally
  • Replace the internal market with ‘integration’
  • Statutory duties of partnership to be placed on NHS bodies
  • 500 primary care groups of GPs (later Trusts) to take control of most of the NHS budget subsuming fundholding.

The run-up to the NHS Plan

By autumn 1999 it was clear that the NHS urgently needed a lot more money. A MORI poll showed that public satisfaction with the NHS fell substantially between 1998 and 2000, from 72% to 58%. Alan Milburn, on becoming Secretary of State, reversed one of Frank Dobson's policies and encouraged co-operation with the private sector. Following a 'Panorama' interview with Tony Blair on 16 January 2000, extra money was found for the NHS in the March 2000 budget, on condition that the service and the professions 'modernised' themselves. Burdens on GPs might be reduced by NHS Direct and walk-in clinics; GPs, dentists, opticians, pharmacists and physiotherapists might group together to take on more hospital work; and old people might move out of big hospitals to convalesce in smaller ones. The extra money – a higher growth rate of some 7–9 per cent – was generous.

In May 2000, the government issued millions of questionnaires and established six service reviews. Richard Branson of Virgin was to advise on how to make hospitals more consumer-friendly. While he and his team made a number of suggestions about abysmal patient care, they went well beyond their remit, concluding that the NHS was being undermined by poor management and promoting an increase in the use of private companies. The report suggested short-stay specialist elective units and GP 'polyclinics', and extended into areas of regional pay, consultant contracts and staff salaries – elements that would later find their way into Labour's NHS Plan.

A seminal analysis had been published in 1999 by Professor Alain Enthoven, who had earlier assisted thinking about the NHS in the mid-eighties. Enthoven analysed the 1991 reforms.15 He saw advantages in the competition and innovation and thought there had been a slight rise in productivity, although there had been higher 'transaction costs'. Fundholding tilted the balance of power from secondary to primary care, and in some trusts improvements had resulted from increased local responsibility for performance. However, information about costs and quality was often not available and incentives were sometimes perverse, with patients following the money allocated contractually, instead of money following patients to the hospital where they wished or needed to be treated. He argued for far greater attention to continuous quality improvement in the NHS. Could Labour make the NHS more responsive without introducing consumer choice and competition, and without substantially more money? Fundamental reform and examination of performance variation were required. He cautioned against 'quick fixes' and Labour's tendency to centralise management and policy-making. He argued that consumer choice – towards which the Conservatives had been moving – was essential. Labour was listening.

The NHS Plan

Labour's second set of proposals for the NHS was issued in July 2000. The NHS Plan16 set out a diagnosis of the problems (for example, honesty about underfunding), an identification of priorities (increasing capacity, improving responsiveness and dealing with the major killer diseases), mechanisms to achieve change and a broad coalition of interested parties. Tony Blair MP, speaking to a meeting of the New Health Network Clinician Forum on Tuesday 18 April 2006, said:

We would first build up capacity and introduce new pay and conditions for staff and set strong central targets for improvement. However, the idea was then, over time, to move to a radically different type of service, abandoning the old monolithic NHS and replacing it with one devolved and decentralised with far greater power in the hands of the patient. The idea was and is to make reform self-sustaining; so that instead of relying on the necessarily crude and blunt instruments of centralised performance management and targets, there is fundamental structural change with incentives for the system and those that work within it, to respond to changing patient demand.

Labour consulted the public and the professions, the latter becoming deeply and often enthusiastically involved. The doctors said that any plan had to be long term, if only because of the time it took to train staff. They could understand the political need for short-term fixes, but this should not detract from the longer view. The British Medical Association (BMA) liked the government's acceptance that the NHS was underfunded and that there were too few doctors and nurses. Of more than 100 proposals, only one was unacceptable to the BMA in principle (debarring young consultants from private practice) and only a handful were questioned (for example, that the staffing problems of the NHS might be solved at the expense of the Third World). The public wanted quicker access to a GP, an end to 'trolley waits' in A&E, booking systems for appointments and treatment, shorter waits for inpatient surgery and better food in cleaner wards. The Times believed that it was a coherent strategy, focusing on enhancing the numbers and function of nurses, addressing the role played by consultants, and increasing the number of beds, which had fallen remorselessly for two decades. There were details and targets aplenty. Initiatives varied from "bringing back matron" to the improvement of hospital food by consulting celebrity chefs. There would be guaranteed access to an Accident Department consultation within four hours, and a telephone and TV beside every hospital bed. Patients would not have to wait more than three months to see a specialist, or more than a further six months to have an operation. Central pressure was exerted on local management to meet waiting-time targets. The BMJ was similarly enthusiastic, saying that "this is as good as it gets – make the most of it."17

Main features of The NHS Plan

  • More doctors, nurses and medical students by 2004
  • Consultants to commit their first seven years to the NHS
  • 7,000 more beds and 100 new hospital schemes by 2010
  • All patients to see a GP within 48 hours by 2004
  • Booking systems to replace waiting lists (later called “choose and book”)
  • A patient advocacy service for each trust, replacing CHCs
  • A UK council to co-ordinate the profession’s regulatory bodies (a reaction to perceived failures of the GMC after problems with heart surgery in Bristol and a determined attack on professional self-regulation)
  • A new level of PCT to provide closer integration of health and social services.

The NHS Plan's aspirations were not costed and, in the event, the same money was spent on several different things, storing up a future crisis: for example, pay awards above inflation and poorly negotiated contracts, reducing the hours worked by junior doctors, the recommendations of NICE, the costs of National Service Frameworks for mental illness, cancer and heart disease, and the costs of establishing PCTs. The NHS Plan raised expectations to an unsustainable level. Alan Maynard, professor of health economics at York, said it contained lots of words and good intent, but that the pearls among the manure had to be teased out. Even with enhanced budgets, the new agenda could not be afforded.

In spite of initial cynicism, patient waiting times declined, partly the result of trusts buying extra capacity by paying their consultants a premium rate to handle additional cases in the evenings or at weekends. There was an assumption that there would be cash enough or that, at least if government was rough enough with the NHS and its management, aspirations would somehow be delivered. Few Trusts had any chance of achieving all the targets and put finance at the top of their priorities. Chief executives might be warned that it would be 'personally dangerous' to make a fuss. The biggest threat to the Plan's objectives was the shortage of skilled staff. In some hospitals, the staffing level on wards was at crisis point and patients were not even being washed.

David Hunter wrote that the evidence from successive reorganisations since 1974 was that altering the structure and configuration of health authorities invariably resulted in unrealistic expectations (for changing the culture of an organisation required stability), costs higher than forecast, a loss of irreplaceable skills and expertise, and a failure to save money. Managers were unhappy, not because of the government's goals, or its diagnosis of the problems of the NHS, but because of the way policy was implemented: the obsession with organisational restructuring, micro-management, short-term demands, 'must do' edicts, and a name-and-shame culture.

Alan Milburn drove the high-profile and politically important Plan. In a series of documents, he set out his vision of a health service in which who provided the service became less important than the service provided. Within a framework of common standards, subject to common independent inspection, power would be devolved to allow local freedom to innovate and improve services. Hospitals earning more autonomy would be subject to less monitoring and inspection, have easier access to capital, and be able to establish joint venture companies.

Legislation

The changes in The NHS Plan and Shifting the balance of Power within the NHS required legislation because of the alterations to the nature and duties of health authorities. Patient advisory and liaison services (PALS) would be established to provide assistance to patients, resolving problems where possible, but helping patients when a formal complaint seemed appropriate. In September 2001, the government established a Commission for Patient and Public Involvement in Health. The impetus owed much to Professor Sir Ian Kennedy who had chaired the Bristol report into heart surgery (2001)18. The Commission had the responsibility for establishing, funding, staffing and managing a network to take over the function of the CHCs. It was a complex structure and in 2004, when ‘arm’s-length bodies’ were reviewed, the Commission’s future was questioned. It was closed in March 2008 to be replaced by Local Involvement Networks (LINks) coterminous with local authorities.

Key Points of Legislation enacted as National Health Service Reform and Health Care Professions Act 2002:19

  1. Wider role and more independence for CHI
  2. CHCs axed: hospital-based PALS, patients’ forums and a national Commission for Patient and Public Involvement in Health
  3. Council for the Regulation of Healthcare Professionals
  4. Strategic Health authorities to be set up; old health authority powers devolved to PCTs
  5. Changes to prison service health care.

Delivering the NHS Plan (April 2002)

After the 2002 budget had increased funding, Alan Milburn published Delivering the NHS Plan – next steps on investment, next steps on reform.20 This introduced important new ideas, such as a change in the pattern of financial flows in the NHS, moving to payment by results (PbR) using a tariff system (a simple worked sketch of the idea follows the list below). Healthcare Resource Groups would establish a standard tariff on a regional basis for the same treatment regardless of provider.

  • Foundation Hospital Trusts would be identified. They would be established as independent public interest companies, outside Whitehall control, and governed only by performance contracts and inspection by the Healthcare Commission. They would have greater freedom of decision-making.
  • Patient choice would be encouraged. Patients would be given information on alternative providers and would be able to switch hospitals to have shorter waits. Patients who had waited more than six months would be offered services at an alternative hospital.
  • PCTs would be free to purchase care from the most appropriate provider, public, private or voluntary.
  • A new Commission for Healthcare Audit and Inspection (CHAI) – this Healthcare Commission would be created by legislation, taking over the responsibilities of CHI, the health audit responsibilities of the Audit Commission, and those of the National Care Standards Commission, a body concerned with the private sector that had only been in operation for three weeks.
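
As a rough illustration of the payment by results idea referred to above, the sketch below multiplies activity in each Healthcare Resource Group by a fixed tariff, so that money follows the patient rather than a block contract. The tariff values and group names are invented for the example; the real tariffs were set centrally.

```python
# Hypothetical tariffs per Healthcare Resource Group (HRG); the real figures
# were set centrally and adjusted for unavoidable local cost differences.
TARIFF = {"hip_replacement": 5500.0, "cataract_removal": 800.0}

def pbr_income(activity: dict) -> float:
    """Provider income under a tariff system: cases treated in each HRG
    multiplied by that HRG's fixed price, summed across all HRGs."""
    return sum(TARIFF[hrg] * count for hrg, count in activity.items())

# e.g. 100 hip replacements and 250 cataract removals:
# pbr_income({"hip_replacement": 100, "cataract_removal": 250})  -> 750000.0
```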

The wheel had turned full circle within a decade and was returning to something like the Conservatives’ market reforms. Alain Enthoven described the plan as a bold wide-open market, more radical than the previous Tory version of an internal market system. Kenneth Clarke agreed that it was the internal market re-written and oriented to patient choice and devolution. Clarke’s reforms had faced a barrage of criticism from medical organisations; now there was little protest.

Modernisation and the Modernisation Agency

Modernisation became the mantra. Many of its concepts had a transatlantic origin, and the new NHS Modernisation Agency worked on projects with the Institute for Healthcare Improvement in Boston. Changes in skill mix, including the use of nurses for triage and to replace medical staff, reflected the development of nurse practitioners in the USA in the 1980s. Treatment centres were similar to US ambulatory care centres. Healthcare Resource Groups were akin to Diagnosis Related Groups. Even national service frameworks owed much to the US guideline and health care pathway movement. The enthusiasm of those outside government who had been involved in the NHS Plan's construction was channelled into the Modernisation Agency and its taskforces to encourage transformation, change, improvement and innovation. Doctors might be antagonistic, partly because of the fear of the increasing power of management, and the acceptance that all clinical decisions had resource consequences. There was a need to balance clinical decisions with accountability and to accept the power-sharing implications of teamwork.21

The NHS Modernisation Board included several Trust chief executives, a professor of surgery (Professor Ara Darzi), a senior nurse, and board members of the Alzheimer's Society, the Citizens Advice Bureaux and the Commission for Racial Equality. The Modernisation Agency – part of the Department of Health – was established to drive change and grew like Topsy, attracting quality staff from trusts and authorities. It was about performance improvement. The Agency became involved in helping failing trusts, running some 60 different programmes. By 2004 it employed 760 people and had a budget of £230 million, but the Department decided to rein it in. Lessons were distilled into ten high-impact changes.22 For example, trusts should treat day surgery as the norm for elective surgery and avoid unnecessary follow-ups. The bulk of its work was devolved and the Modernisation Agency was closed in 2005, parts being integrated into a new NHS Institute for Innovation and Improvement.

The NHS Improvement Plan

In 2004, John Reid published the NHS Improvement Plan,23 four years after the publication of the NHS Plan itself. This stressed the importance of the care of chronic diseases and of public health. It described a vision of patient choice of provider; a reduction in waiting times to a maximum of 18 weeks by 2008; a maximum wait of eight weeks from referral to treatment for cancer patients by 2005; delivering more care, more quickly, through investment and reform; offering people more personalised care and a greater degree of choice; and a greater concentration on prevention rather than cure.

Labour had not traditionally favoured choice in public services, although Alan Milburn had felt that patient choice was important and that the ability of people to choose where, and how, they were treated might improve the system.24 In December 2003, the government published a strategy paper, Building on the Best: Choice, Responsiveness and Equity in the NHS.25

From autumn 2004, patients waiting more than six months for elective surgery were offered the choice of faster treatment in alternative hospitals. Some PCTs established referral management centres to influence and control patient referrals, predominantly those by GPs, either directly or indirectly, and to manage demand so that it stayed within financial limits. In general such schemes saved little money and might delay or reduce the quality of patient care. The alternative providers were often in the independent sector or new independent treatment centres or trusts with spare capacity. A Patient-led NHS, published in March 2005, allowed independent providers such as BUPA to be included on the list of choices. But if money flowed into private hospitals, there was a substantial threat to the budget of NHS ones. Competition was being encouraged.

Arm’s-length bodies

In October 2003, John Reid decided to review NHS arm's-length bodies to save £500 million in staff costs. The number had risen substantially since the 'Quango hunt' of the 1980s (a review of quasi-autonomous non-governmental organisations). Education and training, regulation, and service/back-office functions were handled by 42 bodies employing ten times the number of staff in the slimmed-down Department of Health. In July 2004, the results of the review were published: some bodies were to be abolished (for example, the Commission for Patient and Public Involvement in Health, only established a year or so previously); others were to be combined (for example, the National Clinical Assessment Authority with the National Patient Safety Agency, and the Health Development Agency with the National Institute for Clinical Excellence).

Structural reorganisation (2002) ‘Devolution Day’ – 1 April 2002

Structural change was continuous. New organisations were formed, functions were redistributed, and soon they might be merged with others or abolished. The New NHS – Modern, Dependable began this process. The eight regional offices of the Department of Health’s Management Executive, said to be central to the system, were made redundant.

GP fund-holding was replaced by other methods of giving primary care influence over the hospital sector, Primary Care Groups and later PCTs. Like a pack of cards, other organisations had to change to fit in. As PCTs were given control over expenditure, the function of Health Authorities diminished and they were abolished. The organisational structure unwound until change affected every level – from the GP to the Secretary of State. Mr Milburn said that the NHS seemed top heavy, with the NHS Executive, eight regional offices, 99 health authorities and confused lines of reporting. Power would move to the front line.

The regional offices were reduced in number to four, as part of the Department of Health, and co-located with other government regional functions. The Health and Social Care Act (2001) allowed the Secretary of State to permit companies to provide services formerly provided by the NHS, and to employ doctors, nurses and other clinical staff. It also made possible a new form of Trust – Care Trusts – to provide closer integration of health and social services. Broadening the range of options for the delivery of integrated care, they could levy charges, in particular for ‘personal care’. Four new care trusts, in Northumberland, Bradford, Manchester and Camden & Islington, united mental health trusts and social care, but few were created.

On Devolution Day, major structural change took place. Some 20,000 people were affected as authorities merged, disappeared or were re-formed. Responsibilities were reallocated and the absence of clear guidance gave an impression of making things up as one went along.

The Department of Health

From 1985, when the Department of Health accepted the Griffiths Letter and created a management cadre within the NHS, it changed its own structure, dividing into an NHS Management Executive, while 'wider' departmental functions – for example, international health – remained within the remit of the Permanent Secretary. Increasingly, the Management Executive was staffed by people with managerial skills from the NHS or outside it, as opposed to career civil servants. The relocation of the Management Executive to Leeds in 1992/93 increased this, and progressively the running of the NHS came to seem the most important function of the Department – and one requiring great and continuing political influence.26 In 2000, the top jobs of permanent secretary and chief executive of the NHS Executive were re-combined. The Department was now far smaller than previously, focused on delivering political objectives, and perhaps weaker in policy research capacity. The latter role would frequently be filled by political advisers, often brilliant but with a particular agenda.

Rudolf Klein wrote that, "As of May 2006, only one of the top 32 officials in the DH [Department of Health] was a career civil servant, whereas 18 came from the NHS and six from the private sector. The shift has been from those who saw their role as being to save ministers from themselves, to those who saw it as being to deliver results. If the pathology of the former approach was conservative obstructionism, that of the latter was a readiness to run with even the silliest ministerial initiative."27 The Department would set strategic direction, distribute resources and determine standards; ensure the integrity of the system through information systems, staff training and support for development; develop values for the NHS through education, training and policy development; and secure accountability for funding and performance, including reports to Parliament.

Four new regional Directorates of Health and Social Care (DsHSCs) replaced the eight regional offices. The directorates – North, South, Midlands and East, and London – did not map the boundaries of the previous eight regional offices and, when they had been in existence for only nine months, the Department of Health reviewed functions to shrink its staff and move jobs away from London. The regional directorates disappeared, their work redistributed to the 28 strategic health authorities (SHAs) or to new organisations, such as CHAI and the Health Protection Agency, that were being established.

Strategic health authorities (SHAs)

Twenty-eight SHAs were created, taking on some of the work of the erstwhile regional offices. They would "develop a coherent strategic framework, agree annual performance agreements and performance management agreements, build capacity and support performance improvement."

They also replaced the 96 remaining Health Authorities and managed the local NHS, translating national policy into local strategy. The chief executives were, as a group, broad-brush rather than detail people – charismatic, networking, political, and with a clear view of what they wished to achieve. They constructed plans and annual performance and delivery agreements, and were not involved in operational management or revenue allocations. They shifted from being part of the provider system to regulation, ensuring that the recommendations of bodies such as CHI were acted upon. They 'performance managed' PCTs and NHS Trusts through local accountability agreements and prioritised major capital plans. SHAs related to between five and 19 PCTs. In London there were five SHAs, not unlike the inner parts of the old Regional Health Authorities (for the shire counties had been removed), reflecting the five-sector scheme of Turnberg.

SHAs – 2002

Avon, Gloucestershire and Wiltshire

Bedfordshire and Hertfordshire

Birmingham and the Black Country 

Cheshire and Merseyside

County Durham and Tees Valley 

Coventry, Warwickshire, Herefordshire and Worcestershire

Cumbria and Lancashire 

Essex

Greater Manchester 

Hampshire and Isle of Wight

Kent and Medway

Leicestershire, Northamptonshire and Rutland

Norfolk, Suffolk and Cambridgeshire 

North and East Yorkshire and Northern Lincolnshire

North Central London

North East London

North West London

Northumberland, Tyne and Wear

Shropshire and Staffordshire

Somerset and Dorset

South East London 

South West London

South West Peninsula

South Yorkshire

Surrey and Sussex

Thames Valley

Trent 

West Yorkshire

SHAs could associate to discharge functions better fulfilled together. The five London SHAs did so, dividing certain responsibilities – for example, children’s services, or the Ambulance service – between themselves.

Primary Care Groups and Trusts

The management of primary care had changed little over the years. Now there were radical and progressive alterations. Confusingly named 'Primary Care' groups and trusts, these bodies had expanding responsibilities that spread into commissioning most hospital care. Labour had abolished fundholding, making the formation of primary care groups a centrepiece of its reforms. The knock-on effect on the rest of the NHS structure was only slowly appreciated. Money was increasingly disbursed through primary care groups and trusts. Only a minority of NHS managers had experience in primary care – most had gravitated to the hospital service. In April 1999, Family Health Services Authorities (FHSAs) disappeared, 481 Primary Care Groups (PCGs) were established in England, and fundholding ended.

GPs were brought organisationally together with community nurses within PCGs to integrate GPs, community health and social services. PCGs were a step for GPs into a corporate world. They had complex functions, including the provision and commissioning of care, and a lead role in improving health, reducing inequalities, managing a unified budget for the health care of their registered populations, improving quality, and integrating services through closer partnerships.

PCGs ran for a while in parallel with their health authorities while evolving, often by merger, to become PCTs with wider clinical and financial functions. The number of health authorities fell, driven by a progressive reduction in their responsibility for commissioning services. By 2002, there were 302 PCTs, each covering populations averaging about 170,000. Most PCT boundaries were set with coterminosity in mind, matching the boundaries with those of local authorities. In London there was always a match with local authority boundaries.

The advantages of being big – managing risk and economies of scale – clashed with the advantages of being small, adaptable to local needs and close to primary care. As Trusts grew bigger, their discussions were increasingly concerned with broad planning issues (for example, the commissioning of complex supra-regional hospital services) and less with the details of individual practices and patients. PCTs were very expensive organisations and many merged, for this reduced the transaction costs of contracting. When the first chief executives were recruited, there was no knowledge of the major role expansion about to happen – responsibility for the bulk of NHS funding. In April 2003, allocations were made directly to PCTs and the health authorities were wrapped into SHAs.

Four elements were used to set PCTs' actual allocations (a simplified arithmetic sketch follows the list):

  • weighted capitation targets – a formula based on the age distribution of the population, additional need and unavoidable geographical variations in the cost of providing services
  • recurrent baselines – the actual current allocation that each PCT received
  • distance from target – the gap between the recurrent baseline and the weighted capitation target
  • pace of change policy – the speed of movement towards target, decided by Ministers for each allocation round.
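
As a rough illustration of how these elements fitted together, the sketch below applies a uniform growth rate to a PCT's recurrent baseline and then closes part of its distance from target according to a pace-of-change factor. The growth and pace-of-change figures are invented for the example; the real values, and the exact formula, were decided by Ministers for each allocation round.

```python
def illustrative_allocation(baseline: float,
                            weighted_capitation_target: float,
                            uniform_growth: float = 0.07,
                            pace_of_change: float = 0.3) -> float:
    """Simplified combination of the four allocation elements described above.

    baseline                   -- the PCT's recurrent allocation carried forward
    weighted_capitation_target -- the formula-derived 'fair share' for its population
    uniform_growth             -- growth applied to every PCT (assumed value)
    pace_of_change             -- fraction of the distance from target closed this
                                  round (in reality set by Ministers each round)
    """
    distance_from_target = weighted_capitation_target - baseline
    return baseline * (1 + uniform_growth) + pace_of_change * distance_from_target

# e.g. a PCT £20m below a £320m target, with 7% growth and a 30% pace of change:
# illustrative_allocation(300e6, 320e6)  -> 300m * 1.07 + 0.3 * 20m = £327m
```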

PCTs placed an emphasis on planning. Service Level Agreements were succeeded by Joint Strategic Needs Assessments, on the basis of which contracting was organised. PCTs had to answer the questions: what did an area need? What did the PCT want to buy? And what was available locally? They had to develop new and commercial commissioning skills, and it was important for them to work with providers, wherever possible, to ensure that nobody had a nasty surprise. No more than 10% of services were commissioned regionally or nationally (because they were highly specialised), and GPs were involved through practice-based commissioning, in which they had the right to advise the PCT on the services required. In 2006 it was announced that the number of PCTs would be reduced to 152 from October that year, making them larger and more strategic in nature, saving money and potentially strengthening their commissioning functions.

NHS Trusts

Hospital trusts were least affected by devolution day. Their lines of accountability changed repeatedly as the organisations around them shifted their functions, to regional offices and later to SHAs for their statutory duties, and to health authorities and later PCTs for the services they delivered. The number of Trusts fell through merger; 22 Trusts merged in 1998, and a further 49 in 1999.

NHS Foundation Trusts

The concept of Foundation Trusts is said to have emerged in 2001 when Alan Milburn visited a Madrid hospital that was freed from detailed bureaucratic control and able to borrow money from big banks, rather than using funds under tight public control. Two central ideas were: a new form of social ownership, with services owned by and accountable to local people rather than to central government; and decentralisation and devolution. The concept was trailed in a speech to the New Health Network in January 2002, and several Trusts expressed an interest in piloting the proposals. In July 2002, acute hospital trusts were told they could apply to be Foundation Trusts. Legislation was necessary and details appeared in December in the circular The Guide to NHS Foundation Trusts.28 Each Trust would have a board of governors representing the interests of patients, staff, local partner organisations, local authorities and the local community. The Secretary of State for Health would not have the power to direct them, nor be involved in appointing their board members. The Trust's management board would be accountable to the governors, who would elect the chair and non-executive directors. It was a complex model – perhaps over-complex – and not entirely to the liking of some managers. Foundation Trusts had greater financial and managerial autonomy, including freedom to retain surplus finances, to invest in the delivery of new services, and the flexibility to manage and reward their staff.

NHS Accountability

The idea split the Labour Party. Some MPs feared that foundation status would fragment the NHS and create a two-tier system in which the best hospitals could get more cash and poach staff, that it would denationalise the NHS and allow back-door privatisation. All parties had objections and Trusts’ freedom was progressively constrained. Their borrowing would be on the government’s balance sheet, and pay and conditions of service would be within The Agenda for Change, a national personnel policy. They would be accountable (through contracts) to PCTs. There would be an independent regulator (Monitor) to supervise them and decide what services should be provided and, if necessary, dissolve Trusts. There would be safeguards to prevent the sale of hospitals or their assets, and limit the extent to which Foundation Trusts could undertake private practice – a huge problem for some hospitals such as Great Ormond Street that had a massive international practice. Nevertheless, Foundation Trusts would be able to redevelop and re-equip themselves more easily and carry over surplus money year on year. 

In March 2003, the Health and Social Care (Community Health and Standards) Bill was introduced. John Reid took the Bill through the House; many Labour MPs voted against it, and it was the votes of Scottish MPs, whose constituencies were unaffected by the legislation, that saved the government when the Bill first passed the Commons in July 2003. It was passed acrimoniously from the Commons to the Lords and back again. Foundation Trusts remained divisive. To the proponents, they would set the NHS free from the yoke of central government. To opponents, they were a back-door privatisation that would destabilise the NHS and introduce a two-tier service. Some claimed that they flew in the face of Bevan’s vision for the NHS and destroyed the concepts of equity and universality. Others believed that a varying quality of service from place to place was inevitable within such an immense health care system, that patient choice was required, and that more freedom encouraged development and improvement of the NHS to the benefit of all. The Bill eventually passed in November 2003.

NHS Foundation Trusts differed from existing NHS Trusts in three key ways: the freedom to decide at a local level how to meet their obligations; a constitution that made them accountable to local people, who could become members and governors; and authorisation, monitoring and regulation by Monitor. The Council of Governors, separate from the board of directors, was chaired by the Trust chair. Governors were elected by staff, patients and the public, alongside representatives appointed by the local PCT, the local authority and, where relevant, a university. Not responsible for the day-to-day management of the organisation, budgets, pay or other operational matters, the governors appointed the chair and non-executive directors and determined their pay.

Monitor, an independent regulatory body, was appointed under the Health and Social Care (Community Health and Standards) Act 2003 to assess, authorise and regulate Foundation Trusts. Chaired by Bill Moyes, previously the Director-General of the British Retail Consortium, it considered applicants. One of the first Foundation Trusts, Bradford Teaching Hospitals, moved rapidly into a large and unpredicted deficit. Monitor called in auditors and replaced the chairman and management team. Subsequent waves were delayed to ensure that PbR was taken into account. Monitor introduced systems for assessing Foundation Trusts’ governance, provision of mandatory services and financial performance.

The first wave of ten Trusts authorised on 1 April 2004

Basildon and Thurrock University Hospitals 

Bradford Teaching Hospitals

Countess of Chester 

Doncaster and Bassetlaw Hospitals

Homerton University Hospital

Moorfields Eye Hospital

Peterborough and Stamford Hospitals

Royal Devon and Exeter

The Royal Marsden

Stockport 

In July 2005, the Healthcare Commission submitted its report on the first 20 Trusts. They had increased their ability to plan and develop new services and to relate to their local populations; used their financial freedoms for capital investment and improved services, for example, offering specialist services in the community; increased local public and patient involvement through governor membership; and maintained standards of care in terms of access to and quality of care, with positive relationships with local commissioners and other local providers. They had not destabilised local health services by using unfair competition to attract staff, nor ‘cherry-picked’ patients who were easier to treat; they had continued to invest in staff education and training, and mostly worked in partnership with other NHS services and organisations in the local health community. By the 60th anniversary of the NHS in 2008, there were 103 Foundation Trusts. Some were adopting stratagems to increase their critical mass by associating with other hospitals, or, in the case of specialised Trusts such as Moorfields (eyes) and the Marsden (cancer), developing satellite units off-site. Mental health trusts, in particular, used their new freedoms to good effect.

Our Health, Our Care, Our Say

Published by Patricia Hewitt in January 2006, Our Health, Our Care, Our Say29 proposed a shift of resources from hospitals into the community. Community hospitals in areas of high population – perhaps with a different functional content and a range of clinical specialties – would be encouraged. Major hospital development should be reviewed, and 5 per cent of health resources should be shifted from hospital to community services over the next ten years. The White Paper envisaged bringing some specialties from the hospital nearer to people, for example, dermatology, ear, nose and throat (ENT), orthopaedics and gynaecology, and encouraging community hospitals that provide diagnostics, minor surgery, outpatient facilities and access to social services in one location.

A progress report in October 2006 discussed demonstration projects: GPs who were trained surgeons operating on hernias in upgraded surgery facilities, specialist nurses from hospital following up women who had been discharged early after mastectomy, and GPs with a specialist interest seeing outpatients in place of consultants. The projects were worthwhile, but many required investment in premises or staff training, and did not seem likely to revolutionise health care or save much money.

Structural reorganisation 2006

Labour’s election manifesto in 2005 made a commitment to reduce management costs in the NHS by £250 million. This required a reduction in the number of organisations. Following the election, a further wave of organisational change began: Creating a Patient-led NHS30 had promised to move money from management to the front line. There would be a reduction in the number of SHAs, PCTs and Ambulance Trusts. In December 2005, Patricia Hewitt published Healthcare Reform in England: Update and Next Steps31 and, in April 2006, announced a reduction of SHAs to ten. Coterminosity with the boundaries of the Government Offices for the Regions was almost complete. The role of the new SHAs, established in July 2006, was to develop plans for improving health services in their local area; to make sure local health services were of high quality and performing well; to increase the capacity of local health services so they could provide more services; and to make sure national priorities (for example, programmes for improving cancer services) were integrated into local health service plans.

NHS Structure

The Strategic Health Authorities, July 2006

East Midlands

South Central

East of England

South East Coast

London 

South West

North East 

West Midlands

North West

Yorkshire and the Humber

SHAs consulted on the reduction of the number of PCTs to cut management costs and the new PCTs were established from 1 October 2006.

Map of old and new SHAs

London's changing population - graph

Source: Audit Commission 2008 – Is the Treatment Working?32

The Darzi developments

The Darzi Reports

Professor Ara Darzi had worked at the Central Middlesex Hospital in the 1990s when it was developing a groundbreaking ambulatory care unit, and had joined the NHS Modernisation Board. He was used as a flying ambassador to look at local health problems, became chair of the London Modernisation Committee and a trusted adviser on health issues to Labour. He was knighted in 2002 for services to medicine. In 2006, Sir Ara was asked by David Nicholson, Chief Executive of NHS London (the London SHA), to review services; the review appeared as A Framework for Action.33 One of its fundamental proposals was the establishment of polyclinics to integrate primary care, support services and specialist outpatient services. Wishing to involve clinicians, he hand-picked many for clinical working groups and received support both from the SHA and from the management consultants McKinsey.

Around the time he became Prime Minister in June 2007, Gordon Brown attended a presentation on health issues by Ara Darzi, and when Brown selected his Ministers, Darzi was one of the GOATS, outside people to take part in a “government of all talents”, and became a life peer in the House of Lords. His influence increased rapidly and it was rumoured he had the PM’s mobile phone number. His views and reports came to influence health service policy, hospital system restructuring and issues of quality. Darzi, a clinician, was driven by quality. He said: “I believe what I would like to be said is that I focused our minds on what matters most, with quality being the organising principle of any health care system. It is quality that wakes me up in the morning to come to work; it is quality that my patients expect from me.” 

Lord Darzi of Denham took charge of a national review established by Alan Johnson when Secretary of State. The timescale was rapid – less than a year. Many SHAs were already reviewing their services. In October 2007, when a snap election was being discussed, Lord Darzi published an interim report setting out a ten-year vision, Our NHS, Our Future.34 Its principles were:

  • Fairness – equally available to all, taking full account of personal circumstances and diversity
  • Personalised service – tailored to the needs and wants of each individual, especially the most vulnerable and those in greatest need, providing access to services at the time and place of their choice
  • Effectiveness – focused on delivering outcomes for patients that are among the best in the world
  • Safety – an NHS as safe as it possibly can be, giving patients and the public the confidence they need in the care they receive.

Darzi proposed a new distribution of services in primary and hospital care, including polyclinics, local hospitals, major acute hospitals, elective centres, specialist hospitals and academic health science centres. NHS London, the London SHA, established an agency, Healthcare for London, that consulted on the Darzi report and the London PCTs accepted its thrust. To allay misgivings, Darzi published a report, Leading Local Change35 that said that change would always be for patients’ benefit, clinically driven, locally led, subject to local comment, and that existing services would not be withdrawn until better ones were available. Nevertheless, there was the impetus, at least in London, to push polyclinics forward amid increasing controversy. Most seemed likely to be upgrades of units already in existence. Darzi’s work on London began to have national repercussions as he moved into government.

An NHS University

Labour’s 2001 election manifesto pledged to create an NHS University to assist in-house education and training of all staff. There was some hostility from existing educational bodies; medical schools and nursing departments had no wish to lose students to such an organisation. Established in December 2003 as a special health authority, it never developed a clear role. In December 2004, it was announced that it would merge with segments of the Modernisation Agency and the NHS Leadership Centre as an NHS Institute for Innovation and Improvement, assuming a role in the implementation and delivery of change in the NHS. The Institute was established in 2005 as an England-only Special Health Authority, located on the campus of the University of Warwick.

Research strategy

Britain might be falling behind in research and its translation into clinical practice, in the face of major centres in the USA, on the west coast and in Boston, let alone China and other countries. Imperial College, with its vast educational and clinical resources, planned the establishment of a biomedical research centre (BRC). In 2006/07 the Hammersmith Hospitals NHS Trust and St Mary’s NHS Trust integrated with Imperial College London, creating Imperial College Healthcare NHS Trust, the UK’s largest.

Following the report of the House of Lords Select Committee on Science and Technology (chaired by Lord Walton), a review by Sir David Cooksey (2006) and one by Professor Anthony Culyer in 1997 on UK health care research, all NHS Research and Development budgets were brought into a single funding stream. Professor Dame Sally Davies, the Department of Health’s Director of Research and Development and Chief Scientific Adviser, consulted on how a research strategy should be implemented within the NHS. Best Research for Best Health,36 the government strategy published in January 2006, set the goals for research and development over five years and a commitment to “creating a vibrant research environment that contributes to the health and wealth of England.” The goals were to improve the nation’s health, increase the nation’s wealth, and develop a research system focused on quality, transparency and value for money.

The report was followed by the establishment of the National Institute for Health Research (NIHR) and a move to a more open funding system, including international assessment of biomedical research centres to be supported at a national level. A panel of international experts chose centres, in open competition, as internationally excellent in research. In December 2006, the Secretary of State announced the selection of five comprehensive Biomedical Research Centres to be supported on a national basis – three in London (King’s, UCL and Imperial) plus Oxford and Cambridge – and a further six in particular clinical fields. UCL, with Great Ormond Street, Moorfields Eye Hospital, the Royal Free and University College London Hospitals, came together as UCL Partners. In London, selection as a centre was a guarantee that restructuring of the service would take research excellence into consideration.

Independence for the NHS?

Should the NHS have more independence from government? The clinching argument had always been that, as the NHS was funded almost entirely from taxpayers’ money, parliamentary accountability was essential. Tony Blair believed that independence would have major disadvantages, but others demurred, including the King’s Fund and the BMA. The Conservatives also called for an independent board to run the NHS and an extension of the freedom of Foundation Trusts. A report for Nuffield37 outlined options: a modernised NHS Executive within the Department of Health, separating policy from delivery; a commissioning authority, modelled on the Higher Education Funding Council for England and operating as a non-departmental public body at arm’s length from ministers; or a corporation – a fully managed national service on the BBC model, comprising all publicly owned assets, including Foundation Trusts.

Finance

Parliamentary briefing – NHS Funding and Expenditure

Financial problems became a central issue and the prime ministers, Tony Blair and his successor Gordon Brown, were deeply involved. How much money should come to the NHS and how was it best distributed? A more commercial framework was introduced, particularly in the case of Foundation Trusts. Financial tricks that had enabled authorities to conceal deficits, such as the transfer of capital to revenue, or borrowing money from other Trusts, ceased. The system became more transparent.38 The funds for the health service are the result of annual or biennial negotiation between the Department of Health and the Treasury. The NHS was under pressure from pay rises, the ‘modernisation’ of the pay structure of all NHS staff, new patterns of service (for example, out-of-hours cover in primary care that raised costs), and a shortage of nurses and doctors because too few had been trained, making it hard to use additional money effectively. Capacity bought in from the private sector came at a high cost, and the drugs and medical technology needed to implement NICE’s recommendations created an additional pressure, as did the rising mean age of the population.

Clive Smee, for many years Chief Economist at the Department of Health, said the arguments used to justify more money for the NHS changed over the years. In the 1980s, the changing age structure was held to require an additional 1% a year for the health service merely to stand still. Inflation in the NHS seemed to run at a higher rate than in the economy more generally. In the 1990s, more weight was placed upon service trends, a proxy for public expectations. From the NHS Plan onwards, better information about the performance of the NHS, particularly the poorer outlook of those suffering from cancer in England as compared with Europe, became significant. It became necessary to ‘catch up’ after years of low investment, to provide safe, high-quality treatment, faster access and clean, comfortable accommodation. The costs of medical advance came to the fore with NICE and National Service Frameworks that could be costed with some accuracy. Finally, international comparisons of health service costs achieved greater importance.

Since 1976, money has been allocated as fairly as possible between the different regions of England. Equity between England and the other three parts of the United Kingdom was a political matter governed by the Barnett formula. The allocation formula, which took account of mortality and other factors affecting health care costs, was an improvement on the historic allocations that had gone before, but was regularly criticised and modified. Even when weighted by population characteristics, allocations varied substantially from area to area, by a ratio of 3:2.

Labour stuck to the Conservatives’ spending plans for the NHS and money was increasingly tight. Only small additional sums were available. Labour had a disarming tendency to announce the extra money available over three years, apparently tripling its generosity. The ‘bad winter’ in 1999/2000 was crucial in revealing just how serious the gap between service and demand was, and that Labour’s first two years had not delivered substantive improvement. The sad death of a patient, Mavis Skeet, a pensioner whose surgery was cancelled four times until her cancer became inoperable, added pressure. On 16 January 2000, Tony Blair told a startled Sir David Frost on ‘Breakfast with Frost’: “If this July when we work out the next three-year [spending] period we can carry on getting real terms rises in the health service of about 5 per cent, then at the end of that five years we will be in a position when our health service spending comes up to the average of the European Union.”39 The Prime Minister had not even discussed this with the Chancellor of the Exchequer, Gordon Brown, who was livid at having been bounced into a major spending commitment. Neither did the Department of Health know what was coming. After Tony Blair’s statement – known as the ‘most expensive breakfast in history’ – the government accepted that major injections of money were needed and provided them. Labour kept its promise.

Worldwide, health care systems were under comparable pressures. In the US there was an acceleration in the growth of expenditure, in part because of slower growth in managed care enrolment and a movement towards less restrictive forms of managed care driven by consumer pressure. Some thought that managed care was an experiment shown to have failed: by 2007/08 the United States was spending about 16 per cent of its annual gross domestic product (GDP) on health care, or about $6,400 per head.

The Wanless Review

When government forms a committee it is common to select people likely to make the recommendations that it wishes to see. The Treasury commissioned Derek Wanless, past chief executive of NatWest Bank and a non-executive director of Northern Rock Bank, to estimate, in perhaps the most expensive report ever requested, the resources required to run the NHS in 20 years’ time. To assess what was needed to provide good standards of care was a new departure. From Guillebaud (1956) onwards it had been thought that there was no way of defining an acceptable standard of health service. Financial reports were concerned with getting the best service from the resources available, not what should be spent to achieve defined objectives. Wanless led a team based within the Treasury, which worked closely with the No. 10 Policy Unit and the Department of Health, so whatever the report was, it was not independent. It tried to assess the service costs of, for example, National Service Frameworks.

An interim report, for consultation, appeared in November 2001. The final one, Securing Our Future Health: Taking a Long-Term View40 came out in April 2002. Wanless said that there was no evidence that an alternative funding system would deliver a given quality of health care at a lower cost to the economy, the current method being both fair and efficient. He pointed to the low level of health expenditure in the UK, believed this to be related to our poorer health outcomes than some other European countries, and questioned the assumption that the pursuit of quality would be cost-free. An extra billion pounds was rapidly allocated by government. There was debate about other methods of funding the NHS, but government rejected alternatives such as social insurance. Derek Wanless said that the country needed to devote a significantly larger share of national income to health care. But money on its own was not enough – it was essential that resources were efficiently and effectively used. The report saw a wider role for NICE, and an extension of the National Service Frameworks to cover a wider range of diseases.

It set out projections of resources required over the next 20 years, outlining three future scenarios: an optimistic one, in which the money was wisely and productively used and people demanded better services but learned to look after their own health better; a pessimistic one, in which people were less involved with health issues and the NHS remained unresponsive; and a middle course in which there was solid progress but not all the desirable changes occurred. The projections showed the UK spending between 10.6 and 12.5 per cent of GDP on health care by 2022–23, compared to 7.7 per cent in 2002. The average annual real terms growth rate in UK NHS spending would need to be between 4.2 and 5.1 per cent over the 20-year period, with the highest growth in the early part of the review period – an average of between 7.1 and 7.3 per cent a year in real terms over the first five years – to allow the NHS to ‘catch up’ to standards elsewhere and to create the capacity essential to expand choice in future. Investment in new hospitals and more doctors was long term in its nature, and would not produce rapid improvements. Subsequently, Wanless was asked by the Treasury to provide an update on the problems of achieving the optimistic (fully engaged) scenario. A report on population health trends appeared in December 2003, and another report in February 2004.

One of his recommendations was that, after five years, there should be a review of how the additional money had been spent. The government did not do so, but the King’s Fund did.

In spite of the increasing funds, the financial state of the NHS worsened. The targets and the cash had not been matched one with the other and, in January 2002, the South-East Regional Chief Executive said it was necessary to eliminate an overspend of £60 million by the end of the financial year. The same month, a House of Commons Health Committee Inquiry heard that, in Croydon, implementing the government’s top 20 priorities would cost £70 million, at least ten times the available budget. In Lambeth, Southwark and Lewisham Health Authority, NICE guidance would cost £15 million to implement, depriving patients of other equally effective new treatments, such as new anti-rheumatic drugs. Some policies would have to take a back seat in the attempt to meet the government’s other top priorities.41

On TV in February 2002, the Prime Minister, Mr Blair, said money would need to be found. In March, the Labour Party issued a consultation document, Improving Health and Social Care, to explore the problems to be faced. Underfunding of the NHS over a number of decades was conceded; it was the size of the underfunding, rather than the method of funding, that was to blame.

In his April 2002 budget, Gordon Brown provided a huge increase in NHS funding over the next five years. There would be year-on-year rises in UK spending from £65.4 billion in 2002 to £100.6 billion in 2007, 7.4 per cent in real terms annually and slightly above the Wanless proposals. National insurance contributions were raised by 1 per cent to find the money (immediately costing the NHS an additional £200 million as it was a large employer). Would the massive injection of money achieve results? Beverly Malone, the General Secretary of the Royal College of Nursing (RCN), looked for a big pay settlement: “the money is there and the success of modernising the NHS is riding on the shoulders of the nurses”. Other unions made similar noises. In November 2002, negotiators agreed on the Agenda for Change, a package of proposals for a new pay system covering all staff, save the most senior managers and those covered by the Doctors and Dentists Pay Review Body. There would be a minimum NHS wage of £10,100 per year and a new starting salary of £17,000 for newly qualified nurses and other health professionals. At local (PCT) level there seemed little improvement. Pay rises, changes in junior doctors’ hours, and rising drug costs immediately swallowed much of the extra money available. David Nicholson subsequently said that “extra money had, in a sense, allowed us to subsidise poor care when we shouldn’t have done”.42

In December 2003, a further report, Securing Good Health for the Whole Population,43 which the Treasury had commissioned from Derek Wanless some six months earlier, was released with a minimum of publicity. Setting out the public health challenges to be faced if the Wanless ‘fully engaged’ scenario was to be met, it painted a picture of a country that compared unfavourably with other major Western countries in terms of mortality and morbidity from cancer and heart disease. Derek Wanless also found wide variations in life expectancy across the socio-economic groups. By April 2006, Wanless, no longer working within the Treasury, had come to criticise the extent to which the money made available to the NHS had been spent on salaries, rather than on long-term measures to boost health promotion and public health.

In 2021, nearly two decades after the publication of Securing our Future Health: Taking a Long-Term View, The Health Foundation published their analysis of the report’s impact.

Note added 24/06/2021 by the Nuffield Trust

NHS investment change - graph

NHS Expenditure (England)

Year | Total net NHS expenditure (£ million) | % increase | % real terms increase
1997/98 outturn | 34,664 | 5.1 | 2.1
1998/99 outturn | 36,608 | 5.6 | 3.0
1999/2000 outturn | 39,881 | 8.9 | 6.8
Resource budgeting stage 1
1999/2000 outturn | 40,201 | |
2000/01 outturn | 43,932 | 9.3 | 7.7
2001/02 outturn | 49,021 | 11.6 | 9.0
2002/03 outturn | 54,042 | 10.2 | 6.9
Resource budgeting stage 2
2003/04 outturn | 64,123 | |
2004/05 outturn | 69,051 | 7.6 | 4.7
2005/06 outturn (est.) | 75,822 | 9.8 | 7.5
2006/07 outturn (est.) | 80,561 | 6.3 | 3.4
2007/08 outturn | 89,568 | 11.2 | 7.7
2008/09 plan | 96,213 | 7.4 | 4.3
2009/10 plan | 102,641 | 6.7 | 3.8

Source: Department of Health 

Payment by results (PbR)

Historically, lump sums had been paid to individual hospitals. Now payments received by Trusts depended on the number of cases handled, paid for on the basis of a national tariff. Known as payment by results (PbR) – really payment by activity – money would move with patients. In October 2002, the government issued Reforming NHS Financial Flows – Introducing Payment by Results.44 Block contracts would be scrapped and a tariff-based system would be introduced. The basis for the system of costing had been developed in the USA as diagnosis-related groups (DRGs). An English version, healthcare resource groups (HRGs), had been under development since the Conservatives’ NHS reforms, although the inadequacy of hospital data made the introduction of standard costing systems difficult. HRGs would expand progressively to cover inpatient and outpatient care, and elective and non-elective services in medicine and surgery. Prices would have to vary because costs differed from place to place. The national tariff would need adjustment for local market forces (high cost in London was a fact of life). There were many other factors, for example, the higher expenses of teaching hospitals, the cost of teaching in all hospitals and the funding of services that were complex and essential, almost regardless of demand (for example, burns units). Tariffs progressively expanded.
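
In outline – a simplified sketch rather than the Department’s published pricing rules – a provider’s PbR income for a period can be thought of as the sum, across HRGs, of the activity delivered multiplied by the national tariff for that HRG, adjusted by the local market forces factor (MFF) reflecting unavoidable cost differences:

\[ \text{income} \approx \sum_{h \in \text{HRGs}} \text{activity}_h \times \text{tariff}_h \times \text{MFF} \]

so the same operation attracted broadly the same payment wherever it was carried out, apart from the local adjustment.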

PbR introduced a moral hazard, well known in the USA, where providers (hospitals and doctors) might over-bill, while purchasers would ‘down-code’ (that is, assign a code for a service lower than that actually performed). UK providers were accused of driving up costs and exploiting the system by overcharging, and purchasers of systematically obstructing and delaying payment. Other problems affected the larger teaching hospitals with their higher expenses. A new system of paying money for medical research was introduced, and the large research-oriented hospitals were likely to suffer. The historical allocations might be scrapped in favour of a simpler system that paid an average amount per student year, creating winners and losers. There were few clear lessons from evaluation studies, save that the NHS needed a raft of different but effective commissioning models. Commissioning at single or group practice level might be best for some services; others, with larger populations, might need a regional or national system.

Practice-based commissioning, a further initiative introduced in April 2005, allowed PCTs to devolve indicative budgets to practices. The practices involved became more financially aware of the consequences of their decisions but, in general, progress was slow and some GPs thought that they had inadequate support from PCTs in developing commissioning.

Recurrent financial problems

In spite of the high growth, a major financial crisis developed from 2005. The increasingly commercial nature of the NHS financial system made it more difficult to hide deficits and the new system created winners and losers. One problem was that the estimates had not taken account of the extent to which most Trusts had subsidised health care by using non-recurrent money, land sales, and so on. PCTs found themselves more deeply committed than they had expected, and £637 million had to be redeployed from educational budgets to narrow the gap. Nursing school intakes and medical postgraduate education were cut. ‘Over performance’ by some Trusts beggared their PCTs.

Many hospital Trusts, at the end of the financial food chain, found themselves in dire trouble. They were hit by underfunding for the consultant contract and by the money spent on buying in extra capacity at high cost to meet targets. PCTs also had to meet the cost of the new GP contract, for average net GP pay rose between 2003 and 2005. According to the Parliamentary Select Committee on Health, the Department of Health had miscalculated the additional costs of these contracts, and possibly the Agenda for Change as well. Patricia Hewitt told the Select Committee that the NHS had employed more doctors than central planning had intended or than Trusts could afford. While NHS employers had been involved, the Department had itself taken many of the key decisions.

In mid-2005 an assessment suggested that six SHAs had no hope of reaching financial balance by 2008. Eleven SHAs would need to make savings of £250 million.45 A deficit of £547 million was only reduced significantly by a huge subvention created by cutting the NHS training and education budget. When, in 2006, the King’s Fund analysed where the money had gone, almost half had been spent on higher pay for staff, and another 27% on increases in capital costs, the cost of clinical negligence and on drugs, meeting the recommendations of NICE. Little was left for service development or, indeed, meeting targets such as waiting list reduction.

After five years of historically high growth, the NHS still had problems. Why? There was no single reason but a myriad of causes. Massive expansion in capacity had improved access to services; there had been several structural changes; targets with financial repercussions had been introduced; and not only were there many more staff but there was a wholesale increase in the pay of every group. Consultant pay was £90 million more, Agenda for Change £80 million more and GPs £300 million more than expected. Simultaneously treatment was of increasing complexity. In its programme ’The Blame Game’, on 26 March 2006, the BBC’s ’Panorama’ placed the blame firmly on Department of Health policies and Ministers. A multitude of new policies, not all compatible with each other, made the task of Trusts difficult. Ministers blamed NHS management. The profession blamed government. Some hospitals and PCTs openly delayed treatment, while trying to remain within guidelines for speed of care. Others closed wards and delayed payments. The Department sent ‘Turnaround Teams’ into 18 Trusts in January 2006. Increasingly it was necessary to cull jobs.

By early 2006, there was disarray for a government that had massively increased the money for the NHS, only to find it absorbed by its own policies, targets and pay agreements. As the financial crisis continued to grow, tensions arose between Ministers and civil servants and, in March 2006, the Chief Executive, Sir Nigel Crisp, after five years at the helm, accepted responsibility and took retirement. Sir Nigel’s successor, David Nicholson CBE, giving evidence to the Committee of Public Accounts in October, admitted that the Department “could be better at costing some of our policies.” The BMA blamed the private finance initiative, independent treatment centres and the employment of management consultants. Derek Wanless saw the financial crisis as a threat to health promotion and social care. The Prime Minister saw it as essential to move forward with reform before the next election loomed too closely and the flow of the additional moneys that had been provided lessened.

Patricia Hewitt, Secretary of State for Health, told MPs on 21 November 2006, “we will return the NHS as a whole to financial balance by the end of March 2007 and I will take personal responsibility for that”. Pressure and threats on management and clinicians reduced morale. To attempt to control their budgets, PCTs established teams of financial analysts to look at hospital bills; hospitals had to respond by increasing the costs of their own finance departments. Trusts might delay patient admissions until a time after which they might receive payment, yet still meet target times for the completion of treatment. In the spring of 2007, it seemed that some 15 to 20 NHS Trusts were in such deep financial trouble that they were unlikely to recover their position within ten years. While the NHS as a whole remained in budget at the end of the financial year, there was an increase in patient waiting times as services were reduced. Patricia Hewitt delivered on her promise that the NHS should end in overall financial balance, though 82 out of 372 NHS organisations had a combined deficit of £97 million, concentrated in a small number of organisations, some of which had problems that were hard to resolve.

The King’s Fund commissioned Derek Wanless to take stock. The review, published in September 2007, concluded that increases in NHS moneys had broadly matched assumptions made by the 2002 review. Pay and contractual changes for all NHS staff groups over five years had contributed to higher input costs, with benefits yet to be fully realised. NHS Plan commitments to employ 7,500 more consultants, 2,000 more GPs, 20,000 more nurses and 6,500 more therapists (allied health professionals) by 2006 had been more than achieved, with targets exceeded by 16 per cent, 166 per cent, 272 per cent and 102 per cent respectively.

Building, both of hospitals and for primary care, had made much progress. The funding increase had helped to deliver some clear improvements – more staff and equipment; improved infrastructure; significantly reduced waiting times and better access to care; and improved care in coronary heart disease, cancer, stroke and mental health. But, on balance, the flood of cash had brought disappointing results. From 2002 until 2007, NHS funding had risen at 7.2 per cent per year. In October 2007 the Chancellor announced that, for the next three years, spending would increase at 4 per cent per year, more than had been expected but less than the increases to which the NHS had become used. The corner had, for the present, been turned. The NHS had returned to financial balance. Indeed, the year 2007/08 showed an embarrassing underspend, because Trusts had become used to making economies.

Medical progress 

Advice to government had traditionally come from the Standing Medical and Nursing Advisory Committees, but these were abolished in 2005. The new mechanism for producing national strategies was the National Service Framework (NSF). NSFs were issued in 1999 for mental health, and in 2000 for coronary heart disease. Experienced and senior clinicians, who previously had been appointed to central positions of responsibility to advise the Chief Medical Officer (CMO), now became Health Directors, or ‘tsars’, driving clinical policies for cancer, heart disease, mental health or older people’s services.

Increasingly health services dealt with chronic diseases. Improving their management and reducing the many admissions for which such diseases were responsible became a high priority for government.

Health Promotion

Health and premature death are influenced by genetics, social circumstances, environmental exposures and behavioural patterns. The single greatest opportunity to improve health and reduce premature deaths lies in personal behaviour. Obesity and physical inactivity combined are the top two behavioural causes of premature death.46 Here, and on smoking, effort was concentrated. A range of preventive programmes that seemed cost effective included universal ‘flu vaccination for the over 65s, statin therapy for middle-aged people who were at high risk, screening for Chlamydia infection, nicotine replacement, and increased support for counselling for smokers wishing to stop. Labour had promised a new strategy to break the cycle of ill health due to poverty and deprivation. A mind-numbing series of reports appeared – lengthy, repetitive and, because of the compromises necessary to avoid encroaching on private liberty, more radical than some wished and less prescriptive than others would have them. Reducing health inequalities became a policy objective. Speaking on public health in 2006, the Prime Minister, Tony Blair, said that the role of government was to enable and help people to act with responsibility. Referring to obesity, smoking levels, drinking habits and diabetes, he pointed out that “these individual actions lead to collective costs”.

Repeated changes in the organisational structure created problems for public health as NHS boundaries seldom matched those of local authorities. Until 2001, most public health doctors worked on the ‘purchaser’ or ‘commissioner’ side, and a few as epidemiologists in hospital Trusts. Each Authority had a Director of Public Health, sometimes with other roles such as Director of Health Strategy and usually with support from other consultants and trainees. However, the move to PCTs created difficulties as these were built on general practitioner registered populations rather than a defined geographical area.

When regions were abolished and regional outposts were integrated into the Department of Health, Regional Directors of Public Health became civil servants. ‘Observatories’ were created to report on the problems at regional level. Non-medical staff concerned with public health – for example, health visitors, health educators and environmental officers – became eligible for membership of the Faculty of Public Health, and the post of Director of Public Health no longer required a medical qualification. Public health ceased to be largely a discipline for doctors who had received additional training, and came to incorporate non-medical disciplines that had always made a substantial contribution to the subject, such as epidemiology, economics and statistics. Simultaneously, medical management positions that previously had often gone to those with public health skills were increasingly occupied by clinicians. Regional Public Health Groups were formed to match the strategic health authorities, co-located in each of England’s nine Government Offices.

Acheson Inquiry 1998

Labour commissioned an inquiry into inequalities in health in 1997,47 chaired by Sir Donald Acheson, who had been CMO when The Health of the Nation48 was prepared under the previous administration.

The team, composed of scientists but no economist, based their recommendations on published evidence. Because the poor generally lived shorter, unhealthier lives, and the differences were if anything widening, the key recommendations involved a wholesale redistribution of wealth. Unlike the recommendations of the Black Report (1980), Acheson’s were not costed, and he wanted to see the package implemented as a whole. “Affordability is not a matter for scientists but politicians…” he said. Some recommendations were vague, for example, the need to take “measures to prevent suicide among young people” or “policies to reduce fear of crime and violence”. For most recommendations there were no high-quality controlled studies showing that they would improve health – but hard evidence of effectiveness has seldom underpinned changes in health policy.

Saving Lives: Our Healthier Nation (1999)

In February 1998, before Acheson’s report was published, Labour produced a Green Paper, Our Healthier Nation49 – a contract for health – suggesting a partnership between the Government, local organisations and individuals to improve people’s living conditions and health. It saw the need to reduce the widening inequalities in health. The 1992 Health of the Nation initiative had failed to change spending priorities and made no significant impact on health authorities, trusts or GPs. In June 1999, Frank Dobson, the Labour Secretary of State, followed this with an aspirational White Paper, Saving Lives: Our Healthier Nation, which aimed to improve the health of everyone, particularly the worst-off, taking into account the social, economic and environmental factors affecting health. It reduced the number of health improvement targets to four and set out the contributions to health of social, economic and environmental factors, and of the decisions taken by individuals. The new policy was not substantially different from the old one, though the goal was now to improve the health of the worst-off in particular.

Tackling Health Inequalities: A Programme for Action (2003)

Tackling Health Inequalities50 set out a three-year plan on health inequalities to meet the 2010 national health inequalities target on life expectancy (by geographical area) and infant mortality (by social class). It was organised around four themes: supporting families, mothers and children to break the intergenerational cycle of ill health; engaging communities and individuals to ensure relevance, responsiveness and sustainability; preventing illness and providing effective treatment and care; and dealing with the long-term underlying causes of health inequalities.

Wanless – Securing Good Health for the Whole Population (2004)

In his first report to the Treasury, concerned with financial matters, Derek Wanless had provided three scenarios based on different levels of involvement of the public in relation to their health. Economic analysts within the Department of Health had a substantial input to these scenarios. Later he was asked to provide an update on the challenges, focusing on prevention and the wider determinants of health. Two issues emerged again – regulation versus patient education, and local versus national projects. In 2003/04 he issued a report on population health trends, and his final document, Securing Good Health for the Whole Population51 appeared in February 2004. Wanless thought the costs of the health service would be massively less if there was energetic and effective action by all concerned, including individuals. Wanless drew attention to the problems of smoking, lack of activity and obesity. Key points were that individuals were primarily responsible for their own health and life styles, but lack of information and the context in which they lived might lead to the failure to achieve the substantial improvement in health possible. He pointed to the fact that rigorous implementation of identified solutions had often been lacking and that there was little evidence about the cost effectiveness of public health policies, caused in part by lack of research funding; so there had been many initiatives, often with unclear objectives and little quantification of outcomes. As a result, a further White Paper, Choosing Health was published.

Choosing Health (2004)

Published in November 2004, Choosing Health52 was based on the principles of informed choice (with protection for those too young to choose, and ways of ensuring that one person’s choice did not harm others), tailoring proposals to the reality of individual lives, and working together. It proposed action to increase the number of smoke-free workplaces, curbs on the promotion of unhealthy foods to children, clear labelling of the nutritional content of food, better provision of health information to the public, and NHS Health Trainers to provide advice on life style.

The Health Development Agency

The Health Development Agency, established in 2000 as a special health authority, was the successor to the Health Education Council (1969–1987) and the Health Education Authority (1987–2000). It aimed to develop the evidence base to improve health and reduce health inequalities. It worked in partnership with professionals and practitioners across a range of sectors to translate that evidence into practice. It succumbed to a 2004 review of arm’s-length bodies, and its functions were transferred to NICE on 1 April 2005.

Screening

The National Screening Committee, established in 1996, developed protocols for screening. In 1998 it identified almost 300 screening programmes, many at a research stage and nearly 100 in practice. Only four met stringent criteria for both quality and evidence of effectiveness: breast and cervical screening, and neonatal blood spot screening for phenylketonuria and hypothyroidism. There was a growing body of evidence that screening could harm people, particularly because of false-positive and false-negative results. Breast cancer screening might save lives, but many women had breast surgery that, in the event, proved unnecessary. To Wilson’s earlier criteria (Chapter 2) was added a new one: that there should be evidence from high-quality, randomised controlled trials that programmes were effective in reducing mortality or morbidity. In 2003 it was agreed to work towards a national screening programme for bowel cancer, the second most common cancer in men and women, and testing of stools for blood was introduced for the over-sixties in 2007. Further trials would be undertaken of an alternative method common in the USA, flexible sigmoidoscopy.

‘Multi-phasic screening’ as a form of health check had been popularised in the USA in the 1960s. Now a new form of screening emerged and was subsequently introduced by the private sector into the UK – whole body spiral computed tomography (CT) and magnetic resonance imaging (MRI) scanning. Mobile units might offer cardiac, thoracic or abdominal scans. Other organisations provided Doppler ultrasound investigations to the worried well. There were some positive findings – for example, young men with an operable but clinically silent cancer. Such procedures might take only a few minutes and cost £200 each, not an impracticable sum, though the marketing literature seldom spelt out what could be expected in terms of false positives and false negatives, the images often being reported by unnamed clinicians in other countries.

Quality and effectiveness

Quality became a centrepiece of NHS policy. With a few exceptions, such as the maternal mortality survey, quality initiatives had been unusual in the UK, it being assumed that the ‘producers’ rather than the ‘consumers’ knew what services should be provided. Increasingly, the NHS discovered the patient as an expert on the quality of services. Frank Dobson established CHI and NICE. Increasingly, NHS Trusts were assessed on quality, and morbidity and mortality data, though far from reliable, were published.

The USA had led for many years on quality issues. The Institute for Healthcare Improvement, a not-for-profit organisation leading the improvement of health care worldwide, based in Cambridge, Massachusetts, and the US Center for Quality Improvement and Patient Safety within the Agency for Healthcare Research and Quality were models for the English National Patient Safety Agency, established in July 2001. John Wennberg’s Dartmouth Atlas showed unwarranted variations in health care delivery, for example, variations in the rates of coronary artery and carotid artery surgery, even in the best academic centres. Medical errors and ‘near misses’ were commonplace in routine treatment, often the result of systems rather than individuals. An extensive review by RAND Health documented shortcomings in safety and effectiveness.53 A major report in 1999, To Err is Human,54 directed at politicians and health care leaders as well as doctors, was followed in 2001 by Crossing the Quality Chasm.55 The implementation of standardised performance measures for heart attack, heart failure and pneumonia in 2002 was followed by significant improvement in outcomes.56 A review five years after the publication of To Err is Human showed that progress was being made, at least in the USA, often led by clinicians keen to improve patient care.57 A further Institute of Medicine report in 2008, Knowing What Works in Health Care, recommended that Congress establish a programme comparable to NICE in the UK to conduct reviews and develop standards for creating clinical practice guidelines.

Indicators of care quality were shown to be useful in preventing or treating breast cancer, diabetes, myocardial infarction, heart failure, pneumonia and stroke. Assessments of the quality of medical care delivered to Medicare beneficiaries58 formed part of the literature that led the Institute for Healthcare Improvement to launch its 100,000 Lives Campaign in 2004, targeting six interventions where quality improvement could save patients’ lives: deploying rapid response teams to patients at risk of cardiac or respiratory arrest; delivering reliable, evidence-based care for acute myocardial infarction; preventing adverse drug events; preventing central line infections; preventing surgical site infections; and preventing ventilator-associated pneumonia.

Hospital clinical performance also varied substantially in England. The Audit Commission, in its report A Spoonful of Sugar (2001), reviewed the use of medicines in NHS hospitals, and found that medication errors occurred too often and that their effect on patients and NHS costs could be profound.59 The Department of Health published its own analysis, An Organisation with a Memory (2000).60 The National Patient Safety Agency produced results in 2002 from trials in 28 Trusts which showed over 20,000 adverse incidents over a nine-month period.

Hospitals were given ‘star’ ratings in 2001, covering the process of their care rather than quality and outcome. The system was refined in 2004 using a wider range of standards, incorporated by the Healthcare Commission in its annual Trust health check. In 2003, the Nuffield Trust published a review61 examining achievement in terms of access, capacity, public perception, effectiveness and equity. It found that much remained to be done, but that initiatives were moving in the right direction.

Professional staffing ratios influenced mortality rates. In the USA, research on nurse ratios in surgical units in Philadelphia showed that, after adjusting for patient and hospital characteristics, each additional patient per registered nurse increased the likelihood of dying within 30 days of admission by 7 per cent, and substantially increased nurse burn-out and job dissatisfaction.62 In the UK, Professor Brian Jarman standardised data for age, sex, socio-demographic background, key diagnoses and the number of emergency admissions.63 Hospital standardised mortality ratios (HSMRs) were calculated, and the factor that correlated best with performance was the number of doctors on the staff. A hospital that exactly matched the national average would score 100; better results would be a figure below 100, and hospitals with substantially higher figures proved, in some cases, to have underlying problems in their clinical care. It was an attempt to measure something both important and elusive. Taken overall, the death rates in major English hospitals seemed to be dropping by about 2.5 per cent per year.
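
In essence – simplifying Jarman’s method – the HSMR compares the deaths actually observed in a hospital over a period with the number expected once the standardisation factors above (age, sex, diagnosis and so on) are taken into account:

\[ \text{HSMR} = 100 \times \frac{\text{observed deaths}}{\text{expected deaths}} \]

so a hospital whose observed deaths exactly matched expectation scored 100, and lower figures indicated fewer deaths than expected.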

Mortality data publication

Mortality rates at hospital level were not published until 1994 in Scotland and 1999 in England – later than in the USA. As in the USA, publication was followed by protests that data might be inaccurate and misleading. An independent company, Dr Foster, was established in 2001 in the UK to exploit the mass of data produced by the NHS using Jarman’s methodology. Based at Imperial College, the unit mined available sources of information and the Department of Health became its main customer. The method of comparing hospitals was based on the HSMRs, the number of patients dying after a range of procedures in hospitals, adjusted for the complexity of cases and other factors. A relationship existed between the volume of some procedures and the outcome of treatment. In 1996, the NHS Centre for Reviews and Dissemination published the procedures where this relationship existed. It included coronary artery bypass surgery, paediatric heart surgery, acute myocardial infarction, coronary angioplasty, aortic aneurysm, amputation of the lower limb, gastric surgery, cholecystectomy, intestinal operations, knee replacement, and neonatal intensive care. US work also showed that hospitals with high annual volumes of certain types of procedures had lower death rates; the true association was probably with surgeons who had a high volume of cases.

Variations in the performance of clinicians

The ethos of medicine was changing and the development of controlled trials, meta-analyses, guidelines and organisations such as the Cochrane Collaboration was bringing to an end an era in which clinical experience alone was seen as adequate. Evidence-based medicine was becoming an integral part of undergraduate, postgraduate and continuing educational programmes, with accurate, accessible and regularly updated sources of information. Few would disown the hypothesis that, on average, providing evidence-based procedures would improve clinical outcomes. The Cochrane Collaboration (1992), funded by the NHS Research and Development Programme to help well-informed clinical decisions by preparing, maintaining and ensuring the accessibility of systematic reviews, rapidly became an international movement, as the Cochrane Library and Database of Systematic Reviews. How could stroke and its effects be prevented and treated? What drugs should be used for malaria? Studies could produce compelling evidence for a change to clinical practice. Often it was clear enough what should be done, for example, elderly people should receive influenza vaccine.

Clinical mishaps

The NHS and professionals were exposed to increasing public scrutiny. When serious harm to patients occurred, it was common to hold an inquiry, sometimes a public one. In the 1970s, many were held into mental illness, mental handicap and geriatric hospitals. The same issues were identified repeatedly: for example, isolation – geographical or organisational – poor leadership, bad communication, individuals’ competence and obstacles placed in the way of those who raised concerns at an early stage. Had the media become more active, were staff less disciplined, was the system under too much pressure? Were there interests that relished the demonstration that professionals had feet of clay? Financial problems were often involved, as in the case of delayed admissions, waiting lists and ‘trolley waits’, as were pressure of work and inadequate supervision.

In Bristol at least 29 babies died after heart surgery at the Royal Infirmary, leading in 1998 to a lengthy public inquiry costing £14 million, chaired by Professor Ian Kennedy, with a final report published in 2001.64 There were organisational as well as clinical problems: the hospital facilities were far from ideal, given the number of staff and the split site. The GMC found three doctors guilty of serious professional misconduct. The BMJ said the most chilling thought was that there could have been many similar reports about other parts of the NHS.65 The ingredients in Bristol occurred throughout the NHS. The NHS had no system for monitoring quality, no reliable data and no agreement on what constituted quality. “Thus the most essential tool in achieving, sustaining, and improving quality of care for the patient was lacking . . . clinicians had to satisfy only themselves that the service was of sufficient quality.” Ian Kennedy later became the chairman of CHAI. The media now watched the cases coming before the GMC, which received an increasing number of complaints: 1,000 in 1995; 3,000 in 1999; and 4,470 in 2000.

The Shipman Inquiry and the GMC

The conviction of Harold Shipman for serial murder dealt a further blow to confidence in the medical profession and its systems of self-governance. A GP, for many years in single-handed practice and with a past history of drug abuse, Shipman had administered lethal doses of heroin to many of his elderly female patients. He was convicted at Preston Crown Court in January 2000 of the murder of 15 patients, sentenced to life imprisonment, and later committed suicide. In September 2000, a public inquiry into the case was ordered, and Dame Janet Smith, a High Court judge, was appointed as its chairman. The estimate of his victims rose to perhaps 200. The fifth report of the Shipman Inquiry stated that the GMC had “fundamental flaws”, and the sixth said that he had begun killing early in his medical career, while in the hospital service.66 There was an onslaught on the regulation of the medical profession and a massive reorganisation of the GMC; the coroners’ service and cremation rules changed as well.

The Royal Colleges, the BMA and its General Practice Committee favoured ’re-accreditation’, the regular review of a doctor’s work. The GMC voted unanimously in 1999 for a system of continuing education and regular supervision of the standards of those in practice, and issued a consultation document, Revalidating Doctors, in 2000, starting a lengthy debate on how this could best be done. The government consulted on reforming the GMC’s structure and constitution, increasing the lay membership. The Medical Act 1983 (Amendment) Order 2002, made in December 2002, changed the GMC constitution to include 14 lay and 21 medical members. A system of regular appraisal and revalidation of all doctors in the NHS was agreed. The changes, outlined in A Licence to Practise and Revalidation, published in April 2003, made a licence a legal requirement for all registered doctors in the public or private sectors.

The GMC’s system of revalidation was criticised, and Sir Liam Donaldson, the government’s CMO, was asked to review the role of the GMC. Subsequently, new proposals were made, culminating in new legislation.67

Clinical governance

Several reports had identified the characteristics of good governance in organisations: integrity, openness and accountability. Government saw ‘clinical governance’ as a tool that would deliver quality. Clinical governance seemed to involve corporate accountability for clinical performance, alongside the other managerial responsibilities. An early definition was that it was “a system through which NHS organisations are accountable for continuously improving the quality of their services and safeguarding high standards of care by creating an environment in which excellence in clinical care will flourish”.68 Clinical governance reflected a fundamental change in the powers and responsibilities of managers and consultants: consultants could no longer regard themselves as free-floating entities; Trusts and consultants had mutual responsibilities. However, some saw the process as jargon-rich, and progress was, in any case, slow.

Body parts retention

Pathologists, who had for decades routinely kept blood specimens, histological slides and pathological preparations, found that this practice was unacceptable to some once it became public knowledge. At Alder Hey Hospital, matters had gone further. A report by Michael Redfern QC showed that a huge collection of organs had been built up without full parental consent being obtained, or in some cases even sought. An inquiry by the CMO, Liam Donaldson, showed that more than 105,000 body parts were stored in English hospitals and medical schools, half dating from more than 30 years previously. This was no surprise to any British-trained doctor, part of whose professional education had consisted of the study of preserved specimens showing the marks of disease; though, at times, the retention of organs had gone far beyond that required for bona fide research. New legislation was introduced to ensure that informed consent was obtained.

The costs of negligence

The costs of settling legal actions for clinical negligence were rising. Pressure to put more people through the system did nothing to improve quality. Traditionally the costs of compensation had fallen on the individual clinician’s insurance, but they shifted first to the hospital concerned and, in April 2001, to the NHS Litigation Authority. Much of the money went on very large compensation awards to provide long-term care for badly damaged patients, for example, infants brain-damaged at birth.

Regulation

Government was becoming increasingly impatient with professional self-regulation and the time taken by GMC procedures to reach decisions. High-profile cases led to new measures to monitor performance and swiftly to ban dangerous or incompetent doctors, challenging the medical profession’s traditional independence. In 1998, a consultation paper, A First Class Service,69 heralded change. Under the National Health Service Reform and Health Care Professions Act 2002, the government established the Council for the Regulation of Healthcare Professionals (later the Council for Healthcare Regulatory Excellence). The Council, which had been called for in the Kennedy Report, would have members representing public interests and the NHS as well as the professions. It would oversee the GMC, General Chiropractic Council, General Dental Council, General Optical Council, General Osteopathic Council, Health Professions Council, Nursing and Midwifery Council, the Pharmaceutical Society of Northern Ireland, and the Royal Pharmaceutical Society of Great Britain.

In addition, Labour created five regulatory agencies: NICE, CHI (later the Healthcare Commission), the Modernisation Agency, the National Patient Safety Agency, and the National Clinical Assessment Authority. All were well resourced, most were ultimately accountable to central government, and all were part of a wider movement towards regulation and audit. There was duplication, the number of organisations concerned with quality and regulation spiralled, and a concordat was agreed in 2004, aiming to bring order to the chaos.

Two approaches might be adopted: a ‘person’ approach looked for individual errors; a broader approach looked at the entire system and introduced risk management, with reporting of problems to ensure their rectification. Regulatory bodies might follow a sanction-driven approach, or attempt to stimulate development on the assumption that most staff were essentially well-meaning and would improve their performance given the chance.

National Clinical Assessment Authority

The NHS Plan of July 2000 established a National Clinical Assessment Authority as a Special Health Authority, which in 2005 became part of the National Patient Safety Agency.

National Patient Safety Agency (NPSA)

Building a safer NHS for patients70 put an emphasis on reporting adverse events and risk reduction. The NPSA was established in July 2001 to run a “mandatory system for logging all failures, mistakes, errors and near-misses across the health service”, to feed lessons learned back into practice, and to work alongside CHI. In 2005, the Agency expanded to incorporate the National Clinical Assessment Service and the National Research Ethics Service. It also took over the funding of three confidential inquiries: into Suicide and Homicide (hosted by the psychiatry division of Manchester University Medical School), Maternal and Child Health (hosted by the Royal College of Obstetricians and Gynaecologists), and Patient Outcome and Death (a charity and limited company representing the Royal Colleges).

The National Institute for Clinical Excellence (NICE)

As part of the Comprehensive Spending Review, officials were asked to look at ways of improving efficiency and effectiveness in the NHS. The report proposed a systematic and national approach to the appraisal and management of new technologies. The White Paper The New NHS – Modern, Dependable71 (1997) announced that NICE would give guidance on drugs and other technologies. NICE profoundly changed the way that England and Wales evaluated health interventions. Established in April 1999, with Professor Sir Michael Rawlins, an eminent clinical pharmacologist, as chair, it brought together a number of organisations that worked on quality. Its functions were: appraising new technologies, including drugs, and, from 2003, assessing whether interventional procedures used for diagnosis or treatment were safe enough and worked well enough for routine use in the NHS; deciding which should be encouraged in the NHS and which should be held back; producing or approving clinical guidelines; identifying ways of improving the quality of care and encouraging quality improvement; and funding the four confidential inquiries into maternal deaths, stillbirths and deaths in infancy, perioperative deaths, and suicides and homicides by people with mental illness. (These studies were later transferred for funding to the National Patient Safety Agency.)

NICE was a Special Health Authority accountable to the Department of Health, initially a small organisation. Almost immediately, faced with a likely ’flu epidemic and the release of Relenza by Glaxo, Frank Dobson, the Secretary of State, asked whether the drug should be recommended for NHS use. A fast-track review, weighing evidence that the drug shortened the illness by only a day against the major impact its use would have on GP workload, recommended against it for the coming year. The medical profession and the media were supportive of NICE, and the decision was backed by Tony Blair, the Prime Minister. By April 2005 NICE had issued 86 guidelines, often important to specific but small groups of patients. NICE did not accept or reject health care technologies on cost-effectiveness grounds alone, although this was a major factor, and it operated to a set of principles. However, above £30,000 per quality adjusted life year (QALY), the case for supporting the technology had to be increasingly strong – draft guidance rejecting drugs for kidney cancer was issued on this ground (2008). Some criticised the threshold as too low, excluding some treatments; others thought it too high, with the result that the health service had to economise on forms of care that were possibly more worthwhile but had not been mandated by NICE.
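A minimal sketch of the arithmetic involved (the figures are invented for illustration and drawn from no NICE appraisal): the cost-effectiveness of a new treatment is conventionally expressed as an incremental cost-effectiveness ratio (ICER), the extra cost incurred for each extra QALY gained compared with standard care,

\[
\text{ICER} = \frac{C_{\text{new}} - C_{\text{standard}}}{Q_{\text{new}} - Q_{\text{standard}}},
\]

so that a drug costing £12,000 more than existing treatment and yielding an additional 0.3 QALYs would come out at £40,000 per QALY – above the threshold, and therefore needing a correspondingly strong case for approval.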

Steadily NICE’s credibility grew and it became a major feature of the landscape of clinical medicine. Its guidelines were increasingly welcomed, and its work began to achieve international recognition. NICE was unusual in having a formal appeals process. From 1 April 2005, it was renamed the National Institute for Health and Clinical Excellence (still known as NICE). Government regularly suggested new groups of drugs and procedures for assessment – new rather than existing patterns of treatment.72

The drug treatment of disease

An increasing number of expensive but clinically effective drugs, sometimes ‘life-style’ in nature, were introduced. Prozac, HRT and Viagra might improve aspects of life to which medicine had previously paid less attention. In January 2001, NICE approved three new drugs for the treatment of mild or moderate Alzheimer’s disease. A new anti-obesity drug, Xenical (orlistat), offered the possibility of reducing fat absorption; NICE agreed in 2001 that it could be prescribed when patients were motivated to lose weight and their obesity was significant enough to pose a threat to health. Two other anti-obesity drugs, sibutramine and rimonabant, also appeared, later to be withdrawn on safety grounds. In 2002, NICE recommended the use of bupropion (Zyban) and nicotine replacement therapy (NRT) for smokers who wished to quit.

Expensive yet effective drugs increasingly appeared for common conditions such as cancer, coronary artery disease and stroke. Statins represented the largest drug cost to the NHS (£1.1 billion in 2004), though the cost of simvastatin fell greatly when it came off patent. Herceptin achieved a major reduction in the recurrence rate of an aggressive form of breast cancer in women whose tumours over-expressed HER-2 – but it cost £20,000 a year. Adults with diabetes of the insulin-resistant type stood to benefit from a new class of drugs, the thiazolidinediones. Organ transplantation became more reliable with the development of new immunosuppressants. The treatment of benign prostatic hyperplasia, a common condition, was improved by combination therapy with doxazosin and finasteride, which reduced the risk of acute urinary retention and the need for surgery.

A new group of drugs derived from monoclonal antibodies began to appear. Infliximab (1998), an artificial antibody, was introduced for autoimmune diseases such as rheumatoid arthritis, systemic lupus erythematosus, ankylosing spondylitis and psoriasis. It worked by blocking tumour necrosis factor alpha (TNFα), a chemical messenger (cytokine) and a key part of the autoimmune reaction, preventing it from binding to its receptor on the cell.

Drug safety

Systems for checking drugs for safety and efficacy had existed since the thalidomide disaster of the 1960s. Now the European Medicines Evaluation Agency, based in London, brought together the resources of European Community members. In April 2003, the Medicines Control Agency merged with the Medical Devices Agency to form the Medicines and Healthcare products Regulatory Agency, with a role similar to that of the Food and Drug Administration (FDA) in the USA.

In 2006, at the clinical trial stage, the testing of the monoclonal antibody TGN1412 at Northwick Park Hospital led to simultaneous multi-organ failure in six young men; all survived, but some suffered permanent injury. While reporting systems made it possible to spot increased numbers of rare events such as liver failure, a raised incidence of a common problem, for example heart attacks, was harder to discover. A new antidepressant, paroxetine (Seroxat), seemed to be associated with a raised incidence of suicidal thoughts and suicide. COX-2 inhibitors were introduced in 1999 for the treatment of arthritis in the hope that gastro-intestinal side effects would be less common. They received massive publicity, often direct to consumer, and combined sales exceeded US$5 billion annually. After five years, evidence emerged of a higher risk of heart attacks and stroke, apparently an effect of all drugs in the class. In September 2004, Merck withdrew rofecoxib (Vioxx) from the market. The belief that the company had not drawn attention to early indications of problems led to court actions and substantial awards. Pfizer later withdrew valdecoxib (Bextra), there being well-established alternatives for the treatment of all its approved indications.73

Off-label prescribing

The prescription of a medication in a manner different from that approved was legal and common, often in the absence of adequate supporting data. Off-label uses had not been formally evaluated, and evidence provided for one clinical situation might not apply to others. Off-label prescribing usually entailed use for unapproved clinical indications or in unapproved subpopulations (for example, paroxetine [Paxil] for depression in children), use based on a presumed drug class effect, extension to milder forms of an approved indication, or extension to conditions whose symptoms overlapped with those of an approved indication. The pharmaceutical industry wanted to enlarge its markets to ensure future profits and sustain drug development. The public wanted drugs that were safe, evidence-based and affordable, but also wanted the newest therapies; and physicians wanted the autonomy to prescribe drugs for individuals regardless of the state of approval.

Many patients consulted their doctors after hearing of drugs on TV or in the papers. Europe differed from the USA in its approach to Direct to Consumer Advertising (DTCA) of prescription drugs. In Europe, governments inevitably wished to control drug budgets, while in 1997 the FDA further relaxed controls in the USA. DTCA was a powerful tool, designed to maximise profits by encouraging patient demand. Bob Dole, a former US presidential candidate, appeared on US TV in a commercial for erectile dysfunction – paid for by Pfizer. Some drugs had become household names – Viagra, Prozac for depression, Claritin for allergies, Rogaine for baldness and Matrix for migraine. DTCA was often inaccurate; from 1997 to 2001, the FDA issued 94 notices of violations because the benefits of drugs were hyped and the risks minimised. New Zealand reviewed DTCA and concluded that the benefits did not outweigh the harms; Canada and Australia reached similar conclusions. More drugs were cleared for OTC sale by a pharmacist, for example simvastatin in 2004. Candidates for OTC sale were usually drugs for non-chronic conditions that patients could easily self-diagnose, with a low potential for harm from widespread availability. Three factors motivated OTC sales: the patient self-help movement; the desire to minimise costs to the public purse; and companies’ belief that this was profitable.

Radiology and diagnostic imaging

Imaging technology was increasingly central to accurate diagnosis. The promise of digital radiographic systems was fulfilled as hospitals were re-equipped with systems allowing images to be rotated, enlarged and manipulated. The filmless hospital was commonplace by the end of the decade. Picture Archiving and Communications Systems (PACS) increasingly enabled images to be stored and transmitted electronically rather than printed on film and filed manually.

CT scanning was developing rapidly. In the late 1980s, the rotation of the X-ray source had been combined with continuous movement of the table on which the patient lay. Because the tube rotated while the subject moved smoothly through the scanner, the beam described a spiral path. This meant more rapid scans, more closely spaced cuts and a scan within a single breath-hold, giving three-dimensional images; the colon, for example, could be viewed in exquisite detail. Multi-slice scanners were introduced, with not one row of detectors but up to eight. Image acquisition was faster and a larger area could be covered.

MRI became a rapid imaging tool for the whole body. Widely useful, particularly for neurological and cardiac disease, it could reveal skeletal spread of cancers, sometimes when no primary tumour had been found, and it avoided radiation risks. Positron emission tomography (PET) had a similar and increasing range of uses. In 2000, Nutt and Townsend, working first in Geneva and later in Pittsburgh, combined CT scanners with PET. The result was a single machine that could simultaneously image anatomical structure (for example, a cancer) and metabolic processes, reducing patient discomfort.

Progressive development of imaging systems aided the development of minimal access surgery. Image-guided surgery used imaging systems during a surgical procedure to assist its performance. MRI was the best technique, but it had not been practical because the machines had been fully enclosed. The first truly open scanner, installed in Boston in 1994 and subsequently at St Mary’s, Paddington, in 1999, gave surgeons full access to any part of the patient’s body while the patient was being scanned. Endoscopic views could be combined with MRI images, and important structures could be identified and safeguarded during surgery. Electron beam imaging was also under development.

Conformal radiotherapy, which enabled an accurate dose to be given to precisely the required field even if it was irregular in shape, allowed higher doses to be given with less tissue damage and, it was hoped, better outcomes. Intensity-modulated radiotherapy enabled different doses to be administered to different parts of a tumour, reducing radiation to surrounding tissue, but was available in only a few departments.

Communicable disease and immunisation

Terrorism and the 9/11 attack on the Twin Towers led governments in the UK and Europe to strengthen the EU’s communicable diseases network to cover bioterrorist attacks as well as the threat posed by worldwide mobility of people, infections, food and products.

Getting Ahead of the Curve

In January 2002 the CMO launched Getting Ahead of the Curve,74 to improve the system for preventing, investigating and controlling the threat of infectious diseases. A new Health Protection Agency (HPA) combined the existing functions of the Public Health Laboratory Service, the National Radiological Protection Board, the Centre for Applied Microbiology and Research, and the National Focus for Chemical Incidents. In August 2002, Sir William Stewart, formerly of the Microbiological Research Authority, was appointed to chair the HPA. Forty-two local Health Protection Teams were formed, with access to expert advice on chemical and radiological issues, health emergency planning, and communications.

Infectious diseases

Drug resistance and anti-bacterials

Staphylococcal resistance had been a problem in the 1950s and 1960s. In the 1980s, infection by methicillin-resistant Staphylococcus aureus (MRSA) began to increase once more. The Standing Medical Advisory Committee (SMAC) reported on the threat in 1998 in The Path of Least Resistance.75 SMAC believed that up to 75 per cent of antibiotic use was of questionable value, yet society demanded easy answers and there was increasing use of broad-spectrum antibiotics. By 2000, roughly a third of strains isolated in hospitals were resistant, some now showing resistance to the only remaining antibiotic, vancomycin; there had been a tenfold rise in numbers over ten years. The National Audit Office reported in 1999/2000 that dirty hands and unsanitary conditions in hospitals caused 5,000 deaths a year, and that over 100,000 inpatients became seriously ill with infections. Voluntary surveillance of hospital-acquired infection had long been undertaken; government took the lead, making reporting mandatory. By 2006, Clostridium difficile was rivalling MRSA as a cause of hospital-acquired infection and death and, after a major outbreak at the Maidstone and Tunbridge Wells Trust, the chief executive resigned.

Until the 1980s, a steady stream of new antibacterials had become available; now there was little new investment in them, and the continuing emergence of resistant strains made this a worry. Three approaches were followed: the modification of existing agents; genomic methods; and vaccine development. Agents such as the fluoroquinolones, active against anaerobes and streptococci, appeared. In January 2001, Zyvox (linezolid), a synthetic antibiotic of the oxazolidinone class used for the treatment of serious infections caused by multi-resistant organisms including MRSA, was approved. As a protein synthesis inhibitor, it stopped the growth of bacteria by disrupting their production of proteins – a bacteriostatic rather than bactericidal action. With ever-increasing microbial resistance, accurate information on antibiotics and their proper use became essential; a free peer-reviewed database was provided on the Internet by Johns Hopkins, Baltimore.76

Tuberculosis

Annual notifications of tuberculosis in England were steady at roughly 6,000, although the number in London was rising steadily. Around seven out of every ten people with TB came from an ethnic minority group, and two-thirds were born abroad. Six per cent of cases were multi-drug resistant. TB was an increasing global problem because of the breakdown of health services in parts of the developing world, the spread of HIV infection and the emergence of multi-drug resistant tuberculosis – that is, resistance to the two key first-line drugs, rifampicin and isoniazid. Extensively drug-resistant tuberculosis (XDR-TB), with resistance to at least three of the six classes of second-line agents, was almost untreatable; the first British case (in a Somali patient) was admitted in 2008.

AIDS and sexually transmitted diseases

Internationally, AIDS deaths reached record levels and, in 2007, it was estimated that 33 million people were living with HIV/AIDS and that 25 million had died. Of the cases, two-thirds were in Africa and one-fifth in Asia, where China increasingly figured. In Thailand, AIDS affected first homosexuals, then drug users, prostitutes, their clients, the wives and girlfriends of the clients, and then the children of those women. In the UK, by 2007, about 18,000 deaths were known to have occurred, and an estimated 77,400 persons of all ages were living with HIV, perhaps a third of whom were unaware of their infection. In the West, the introduction of highly active antiretroviral therapy transformed the prognosis, dramatically reducing opportunistic infections and mortality. In the USA, while infections contracted homosexually had been stable from 1992 to 1996 and the death rate was falling, the rate of new infections doubled between 1996 and 2000, because more people were healthy and less scared of unprotected sex. In 2000, heterosexual sex became the leading cause of new infections in England. Two-fifths of newly diagnosed persons probably acquired their infection in the UK, of whom approximately two-thirds were men who had sex with men. Sixty per cent of new cases were in London, and more than half of these were in ethnic minority populations. The patients might be on student visas, seeking asylum or, less commonly, seeking treatment not available in their own countries. Increasingly, HIV was drug resistant: 27 per cent of new cases in 2000, compared with 14 per cent in 1994.77 In 1999, it became policy to offer HIV testing to all pregnant women; with their multi-racial populations, the inner cities had the highest incidence of maternal infection. In a tragedy that attracted little attention, over 1,000 haemophiliacs contracted HIV from therapy, of whom 600 died. The former Solicitor General, the Rt Hon Lord (Peter) Archer of Sandwell QC, headed an independent public inquiry into what Lord (Robert) Winston had described as “the worst treatment disaster in the history of the NHS”.

A tendency to complacency in the western world contrasted with the tragedies elsewhere. There were only two approaches to the epidemic: preventing new HIV infections, and anti-retroviral treatment for those needing it. In established HIV infection, treatment included a combination of three potent anti-viral drugs, possibly four. In 2007, three new agents were developed for the management of drug-resistant virus: two new classes of anti-HIV drugs – entry inhibitors and integrase inhibitors – and a second generation of non-nucleoside reverse transcriptase inhibitors (NNRTIs). There was new hope for patients in ‘deep salvage’ who were resistant to currently available drugs. For the third world, new low-cost drug regimens, administering zidovudine in pregnancy, offered a chance of reducing transmission of infection from mother to fetus by some 80 per cent; more intensive treatment from 28 weeks of pregnancy was even more successful. But though drug companies might offer special deals, funds were hard to find in the third world and “dismal” was a charitable way of describing the treatment-coverage rates in many countries. The problem was compounded by the unwillingness of some fundamentalist churches to talk about condom use.

Other sexually transmitted diseases

Cases of gonorrhoea in England and Wales rose from 16,470 in 1999 to 18,710 in 2007, with a peak of 25,000 in the middle of this period. Over 40 per cent of the cases were in London. Until 1998, the number of cases of infectious syphilis remained stable among both sexes in England, but then rose rapidly to 2,437 in 2008. Outbreaks were often localised and might have different characteristics: for example, occurring predominantly among men who had sex with men, or being associated with commercial sex work and crack cocaine use.

Cases of chlamydia, the most common sexually transmitted infection seen in clinics, had been rising since 1993, partly as a result of improving awareness and diagnosis – from 56,991 in 1999 to 121,986 in 2007. The highest incidence per 100,000 was in black ethnic groups. A national chlamydia screening programme for women aged 16–24 began in 2003, and a programme of immunisation against human papilloma virus started.

Bovine spongiform encephalopathy (BSE)

The Labour government established a judicial inquiry into the handling of BSE, which reported in October 2000. Twenty-eight ministers, civil servants and scientists were criticised in the report.78 A culture of inter-departmental dispute between the Ministry of Agriculture and the Department of Health, and unnecessary secrecy, was exposed. The public assurances of safety, given at the most senior level, had been flawed at the time they were given. The then Chief Veterinary Officer, Keith Meldrum, and the CMO, Sir Donald Acheson, were accused of glossing over potential health risks. A government compensation scheme was introduced for victims of Creutzfeldt-Jakob disease (CJD) and their families.

There were continuing cases of new variant CJD (vCJD): ten in 1996, ten in 1997, 18 in 1998, 15 in 1999 and 28 in 2000; by November 2007, a total of 166 confirmed and probable cases, with 162 deaths, had been reported throughout the UK. The numbers were beginning to decline. Though infection in cattle was most marked in the UK, it was also reported in France and Germany, and human vCJD was found in France. A cluster of five cases in a single small village, Queniborough, provided useful information: all the patients had lived in the village between 1980 and 1991 and died between 1998 and 2000, giving some indication of the incubation period in their cases. Initially it was not known whether vCJD was transmitted by blood products but, in 1998, it was decided not to use UK blood plasma and instead to import it from the USA. Whole blood was also treated to remove white blood cells. Of those known to have vCJD, at least nine had been blood donors, so there was a small risk of infection from transfused blood. In December 2003, a case of vCJD was reported in a patient who had previously received blood from a donor incubating the disease, and two more followed. In 2004, people who had themselves received blood transfusions were barred from being blood donors. Blood donated by a small number of people who went on to develop vCJD was traced. Some recipients had received direct one-to-one transfusion of whole blood; they were contacted and told about the risk they might face. Plasma from the same donors, used to manufacture products such as clotting agents, was traced and recipients contacted; some 6,000 people were involved. A study of 12,500 specimens of tonsils and appendix showed three cases of the prion responsible for vCJD, suggesting that perhaps 3,000 people carried the infection. Guidelines in 2001 suggested that disposable surgical instruments should be used in operations such as appendicectomy and tonsillectomy but, after operative difficulties followed their use, they were withdrawn.

Hepatitis C

Compensation (£20,000) became available to people who had been infected with Hepatitis C from blood transfusions and developed liver cirrhosis, after the hazard was recognised but before the introduction of routine screening in April 1991. This was extended to cover a wider range of people infected as a result of being given blood products by the NHS. 

Other infectious diseases

Britain’s biggest outbreak of Legionnaires’ disease occurred in July and August 2002 in Barrow-in-Furness, when a cloud of infected steam from an inadequately maintained air-conditioning unit passed over the town centre. There were seven deaths and a total of 179 cases.

In 2003, an unusual form of pneumonia was recognised – severe acute respiratory syndrome (SARS) – the first pandemic of the twenty-first century. Due to a new coronavirus that emerged in Guangdong Province in mainland China, it affected more than 8,000 patients and caused 774 deaths in 26 countries on five continents.

Food-borne infection continued to cause concern. A survey of 70 general practices produced an estimate of 9 million cases annually, most of which were never seen by a GP. Campylobacter was the most common bacterial isolate, but in the majority of cases, none of the main food poisoning organisms was identified. Milk-borne coliform infection in Cumbria received national attention and Norwalk-like virus gastroenteritis affected a number of large cruise ships.

Until 2002, the UK had enjoyed the luxury of freedom from rabies. That year, the first indigenously acquired case in over 100 years occurred: a naturalist and bat-handler died from European bat rabies, and infection was verified in bats in various parts of England. People bitten or scratched by a bat now had a small risk of developing rabies.

Immunisation

In 2000–2001, about 94.5 per cent of children had been immunised against diphtheria, tetanus and polio by their second birthday, and about 94 per cent of 2-year-olds had been immunised against Haemophilus influenzae b, 94 per cent against pertussis, and 87 per cent against measles, mumps and rubella (MMR).

Over the previous ten years, meningitis as a result of meningococcal infection had steadily increased. More cases were associated with septicaemia, and more children and teenagers were dying. A newly emergent strain, meningococcal group C, was largely responsible, and a new vaccine for this strain was introduced into the routine childhood programme. In 2001 there were 79 confirmed cases of meningitis C and three deaths, compared with 551 cases and 47 deaths in 1999, before the vaccine was introduced. Routine infant immunisation against Haemophilus influenzae, introduced in 1992, also continued to prove successful. In 2001, a campaign to immunise travellers on the annual Hajj pilgrimage to Mecca, estimated at 50,000, dramatically cut the number of cases from 45 in 2000 to just six, with no deaths.

A paper by Andrew Wakefield and colleagues from the Royal Free Hospital, published in The Lancet,79 suggested a link between the combined MMR vaccine and some cases of autism. Though the scientific limitations of the paper were immediately apparent, the media took up the scare story and rates of MMR immunisation fell. Neither studies suggesting that the vaccine was safe nor government campaigns allayed anxieties. Uptake fell from 91 per cent to below 80 per cent. Outbreaks of measles followed, with 740 cases in 2006 and 971 in 2007, and the first deaths since 1992. In 2004, a Sunday Times journalist, Brian Deer, published an investigation into the paper, uncovering the possibility of research fraud, unethical treatment of children, and a conflict of interest arising from Wakefield’s involvement in a lawsuit against vaccine manufacturers. The GMC launched an investigation which resulted in Wakefield being struck off. The Lancet retracted the paper in 2010 and, subsequently, Deer unearthed clear evidence of falsification: clinical records could not be fully reconciled with the descriptions, diagnoses or case findings published in the journal. Late in the day, the media changed sides and supported immunisation. Children had suffered, and energy and money had been diverted from efforts to understand the real causes of autism.

In August 2004, the routine programme for children was modified and a single vaccine against five diseases was introduced: diphtheria (D), tetanus (T), pertussis (a new acellular whooping cough vaccine), inactivated polio vaccine (which replaced live oral vaccine because of the one-in-a-million risk of the live vaccine reverting to the wild type), and Hib (Haemophilus influenzae type b). In 2006, a vaccine for pneumococcus was also introduced into the childhood programme.

The development of a vaccine against infection with human papilloma virus (HPV), which, though often benign, could lead to cervical and anogenital cancer, opened the possibility of preventing a substantial number of cases. In 2006, the vaccine received its European licence, and a programme of vaccination, starting with girls aged 13, began in September 2008 with a massive nationwide campaign.

Alternative medicine

Complementary, or alternative, medicine remained in public demand. The common factor seemed to be the time and patience of practitioners, commodities in short supply in the NHS. The Prince of Wales, like many of the Royal Family, was a long-standing advocate of these therapies, or at least of research into their effectiveness. In November 2000, the House of Lords Select Committee on Science and Technology, chaired by Lord Walton, reported that there was scant evidence that alternative remedies worked. Yet the public spent £1.6 billion annually, and 50,000 practitioners were treating some 5 million patients. Only for osteopathy, chiropractic and acupuncture was there limited evidence of efficacy. The report commended some aspects of herbal medicine, as a large number of effective drugs were of herbal origin, but there was no convincing evidence to support homeopathy, which stood out as being no more than a placebo. Complementary measures such as massage and aromatherapy gave some comfort to patients with terminal illnesses, even though they had no effect on the progress of the disease. The subcommittee was particularly concerned about dangerous and inaccurate information that appeared in some media articles and on the Internet. Better regulation of practitioners, and control of misleading product labelling, were needed.

In December 2001, the Department of Health expressed a willingness to consider the provision of some forms of complementary medicine within the NHS, subject to evidence of their effectiveness. Almost certainly such evidence would be a long time coming. How, in any case, could one regulate and examine people in an ever-changing and expanding number of alleged disciplines, most of which had the scientific credentials of snake-oil salesmen? In 2004, the Department of Health provided £900,000 to The Prince of Wales’s Foundation for Integrated Health to support its work in developing “robust systems of regulation for the main complementary healthcare professions”. The foundation published an online guide to help people understand the main alternative therapies available. The guide avoided the obvious question: does it work? Leading doctors spoke out against alternative therapies, and a group wrote to the chief executives of all acute trusts and PCTs, urging them to stop paying for unproven or disproved alternative medicine at the expense of more conventional treatments.80

Genetic medicine

Genetics offered the possibility of early identification of people likely to become ill, perhaps from vascular disease or diabetes, because some disorders are preceded by a prolonged presymptomatic period. This opened the possibility of preventing disease rather than diagnosing it at a later stage. Genetics also provided an insight into the cause of many diseases at a molecular level, allowing their mechanisms to be understood rather than just described. Diseases previously thought of as one type could be separated into categories with different origins, and therefore different treatments. A computer algorithm designed to seek out breast tumours with the most similar genetic profiles and cluster them together revealed that 98 cancers fell into two main groups that could be distinguished on the basis of the activity of 70 genes. A woman with a ‘poor’ 70-gene signature would be 15 times more likely to suffer a recurrence within five years than a woman with a ‘good’ signature, who might possibly be spared aggressive chemotherapy with all its side effects. The possibility existed of developing smart drugs with an appropriate therapeutic action. Genetics provided the pharmaceutical industry with a wealth of new targets against which to design drugs; suddenly the industry went from famine to feast, and trials of gene therapy in a number of single-gene diseases were under way.

The Human Genetics Commission (HGC) was created in 1999 to provide the government with advice on genetics and the wider social and ethical issues involved, including the use of genetic data in insurance. The Commission’s concern that the results of genetic tests might be used by insurance companies to the detriment of the population was examined, and it was agreed that insurers could use genetic test results for assessing the risk of Huntington’s disease.

Academic centres evolved into regional centres serving populations of 2 to 6 million, with ‘hub and spoke’ links to clinics in district hospitals providing access to new developments, clinical diagnosis, laboratory (DNA and chromosomal) diagnosis, genetic counselling and the long-term care of extended families. In 2003, the White Paper Our Inheritance, Our Future painted a vision of a future health service offering personalised care based on a person’s genetic profile.81 Two National Reference Laboratories, to help the NHS keep abreast of new genetic testing methods and discoveries, were formed in Manchester and Salisbury in 2002.

The lengthy hunt for the structure of the human genome accelerated as commercial interests joined the race alongside the international programme led by the American National Institutes of Health. A draft of the entire genome was announced in June 2000, and the virtually complete mapping of the human genome followed between 2003 and 2006.

The development of preimplantation genetic diagnosis (PGD) could ensure that a newly conceived embryo was not a carrier of some genetically determined diseases. In 2004, the Human Fertilisation and Embryology Authority (HFEA) agreed to the use of PGD for assessing some risks and, by 2006, a test for the BRCA1 or BRCA2 gene mutation or any of hundreds of other diagnostic assays became directly available to people in the USA. Direct-to-consumer marketing of genetic testing and other laboratory services had arrived.

Cardiology and cardiac surgery

Diagnosis was increasingly aided by developments in imaging and non-invasive techniques. MRI could assess heart function, mass and volume, and detect infarction; contrast media used with MRI could demonstrate infarcts. Ultrasound could also assess heart function. Ultra-fast CT scanning for coronary artery calcium might indicate an increased risk of a heart attack. There was a clearer understanding of the pathology of coronary arterial plaques, and better blood tests, such as the measurement of plasma myeloperoxidase, made it easier to predict the risk of myocardial infarction.

A National Service Framework for Coronary Heart Disease was introduced in 2000 and outlined good practice: smoking cessation clinics, rapid access to chest pain clinics, rapid thrombolytic treatment, shorter delays for assessment and treatment, more effective use of aspirin, beta blockers and statins after a heart attack, and more coronary artery surgery. Estimation of cardiovascular risk became the starting point of therapy. Acute coronary syndrome, which could proceed to a heart attack, might be treated by early administration of a glycoprotein inhibitor. There was often dramatic improvement from the use of antiplatelet and thrombolytic agents in the treatment of coronary disease. Wald and Law suggested that heart disease and stroke could be reduced substantially by giving, on a population-wide basis (for example, to everyone over 55 years), a ‘polypill’ that would reduce low-density lipoprotein cholesterol, platelet function and blood pressure.82 Possible constituents might be aspirin, a statin, a beta blocker, a thiazide, an angiotensin-converting enzyme (ACE) inhibitor, an angiotensin II receptor antagonist and folic acid.

The prognosis of congestive heart failure was improved by the use of ACE inhibitors. Guidelines suggested that high blood pressure was best treated with a combination of drugs. A new class of drugs, the vasopeptidase inhibitors, was shown to be at least as effective as the existing ACE inhibitors in the treatment of high blood pressure, cardiovascular and ischaemic heart disease. The evidence of benefit to patients with high blood cholesterol levels and atherosclerotic disease from cholesterol-lowering drugs such as statins had been clear since the mid-1990s. Yet, although they were easy to take, effective and comparatively free from side effects, they remained under-used: fewer than a third of patients with a history of coronary artery disease or stroke received lipid-lowering treatment. In 2004, simvastatin was made available over the counter.

Manual defibrillators had long been available but required training, for example in the interpretation of electrocardiograms, which restricted prompt treatment. In adults, the commonest primary arrhythmia at the onset of cardiac arrest is ventricular fibrillation or pulseless ventricular tachycardia, and survival depends crucially on minimising the delay before a counter-shock is given. Ambulances seldom respond rapidly enough to provide defibrillation within the desirable eight minutes. This led to automated external defibrillators (AEDs), which talked the attendant through the process and so could be used by an untrained person. They began to appear in public places and proved highly effective where there were designated rescuers (such as flight attendants or security guards) or a large number of bystanders immediately available. Used within the first two minutes after collapse, they improved outcome over traditional methods. In 2002, AEDs went on public sale in the USA for about $2,500; in 2004, they became available in the UK.

Coronary artery surgery

Early in the decade, in some areas of the USA, ambulances were routed past hospitals towards cardiac centres equipped to perform surgery and angioplasty 24 hours a day. In 2006, six centres were nominated across London to which ambulances could go directly when paramedics identified the patient as having had a heart attack. The two main procedures for opening up blocked coronary arteries, balloon angioplasty and open heart surgery, were rivals. Coronary artery bypass grafting (CABG) was more invasive, requiring lengthy rehabilitation. However, new techniques allowed grafting on the beating heart, reducing the morbidity associated with the pump and induced cardiac arrest.

Percutaneous coronary angioplasty required only one or two days in hospital, and patients could expect to be back at work within a week. Primary angioplasty could be used in the early stages of a heart attack and seemed to improve patient outcome. First undertaken in 1977, the procedure was now routine. Restenosis initially occurred in over 30 per cent of patients but, with advances in stent design and improved techniques, the rates came down to 10–20 per cent, comparable to the 10 per cent of vein grafts that are lost in the year after bypass grafting. Newer stents were made of metal coated with a cytostatic agent such as sirolimus or paclitaxel, released slowly and locally to reduce the proliferation of smooth muscle and so, it was hoped, to reduce arterial restenosis. The long-term advantages of the two procedures remained a matter of debate, but the demand for cardiac surgery was shrinking, and the NHS was increasingly organised to provide angioplasty as an emergency procedure round the clock.

Artificial heart systems had been developed in the early 1980s and had been used in the USA and Europe, though only occasionally in the UK. As well as total heart replacements with a high stroke volume, there were left ventricular assist devices with small electrically powered pumps, providing a lesser stroke volume and pulsatile blood flow, which could be used to support the patient’s own heart and give it time to recover. The larger total replacements were used as a bridge before heart transplantation.

Some drugs used to treat disturbances of heart rhythm could themselves have dangerous side effects. Implantable cardiac defibrillators were used to treat ventricular arrhythmias; automatic cardioverter defibrillators could be life-saving, and promising developments took place in pacing technology, which improved substantially, particularly for younger patients. Catheter ablation – the accurate destruction, through a catheter within the heart, of the area responsible for the disturbance of rhythm – was also introduced. Multiple leads could be inserted to synchronise contractions of the different chambers of the heart and improve cardiac output.

Deep vein thrombosis, long recognised as a hazard of bed rest, came to public attention following the death of a young passenger flying from Australia. Studies showed that perhaps one in a hundred passengers flying long haul developed asymptomatic thrombosis. It seemed associated with lengthy flights, alcohol consumption, and failure to move about the cabin.

Organ transplantation

The results of organ transplantation steadily improved with advances in immunosuppressive therapy. From the early 1980s onwards, drugs such as cyclosporin and tacrolimus had made modern transplantation care possible. Mycophenolate mofetil reduced the risk of acute rejection, and monoclonal antibodies were also increasingly used. In 1969, fewer than half of those with kidney transplants had a functioning kidney after a year; now 95 per cent did, and most transplanted kidneys would probably work for 20 years. Some 3,000 new cases needed a transplant annually, and some 3,000 were transplanted. However, there were 8,000 people on the waiting list, and roughly three died daily while waiting. Some centres began to retrieve organs from non-heart-beating donors as well as conventional brain-dead donors. In the USA, patients were increasingly urged to look for a live donor among their friends and relatives, who might give a kidney (or part of the liver); transplant survival was better and the wait for transplantation shorter. As a result of the increased use of seat belts and the better treatment of subarachnoid haemorrhage and strokes, the number of organs available for transplantation fell throughout the decade. Donors had to be “fit but dead” and most potential donors were to be found in critical care units – perhaps only 2,500 people a year were suitable. In 1999, the BMA voted in favour of an ‘opt-out’ system, so that organs would automatically be available unless there was a written statement to the contrary. Transplant surgeons, however, preferred the existing ‘opt-in’ approach, fearing public reaction to a more radical policy. In 2006, Ministers established the UK-wide Organ Donation Taskforce, chaired by Elisabeth Buggins, to identify barriers to organ donation and make recommendations. Reporting in 2008 in Organs for Transplants, the Taskforce examined the pathway from a potential donor’s death to transplantation, and the obstructions along it.83 Its recommendations aimed to make an unusual procedure usual, and to encourage local action to solve a national problem. It sought a National Donation Organisation within NHS Blood and Transplant, and suggested appointing 100 additional transplant co-ordinators to increase donation, removing financial disincentives from hospitals whose own patients would not, in fact, benefit, establishing Trust Donation Committees to encourage local action, and pursuing the possibility of an opt-out system.

Heart transplantation survival rates improved, especially in the first year after the transplant: about 88 per cent of patients survived one year and 72 per cent survived for five years. Since 1995, there had been a significant fall in the number of patients receiving new hearts, lungs, or heart and lungs through transplantation. In 2000, 265 patients in the UK were treated, and there were six heart and lung transplant centres in England (Birmingham, Cambridge, London, Manchester, Newcastle and Sheffield). Did centres do enough operations to maintain expertise? A National Specialist Commissioning Advisory Group took responsibility for commissioning heart and lung transplants for both adults and children.

The techniques necessary for the reattachment of limbs had already been developed and, in 1998, the first hand transplant was carried out in France; a second – more successful – was undertaken the following year in the USA. The first face transplant was also undertaken in France in 2005.

Stem-cell transplantation was revolutionising the outcome of a range of malignant and non-malignant blood disorders, including immunological diseases. Stem cells could be obtained from blood, bone marrow and umbilical cord blood. Cord blood banks were established in London, Bristol, Belfast and Newcastle to collect, preserve and type blood products, and to test for viral contamination.

Anaesthesia

Advances in surgery partly depended on the steady advance of anaesthesia: better drugs and better equipment improved the speed of recovery and made the surgeon’s task easier.84 Since 1948, the simple Boyle’s machine carrying cylinders of nitrous oxide, oxygen and often cyclopropane, with its lever-controlled vaporisers for ether, had been replaced by more complex equipment with calibrated vaporisers coupled to central gas supplies. Chloroform and ether were superseded by halothane, enflurane and sevoflurane. Intravenous barbiturates such as thiopentone gave way to short-acting drugs such as propofol (1986), with a significantly lower incidence of nausea, vomiting and post-operative drowsiness. Endotracheal intubation was largely replaced by the laryngeal mask airway. Neuromuscular blockade with curare or suxamethonium made possible light levels of anaesthesia and rapid recovery. Local and regional anaesthetic techniques improved with the introduction of more reliable drugs. Monitoring of inspired and expired gases became readily available, and blood oxygen levels were routinely monitored.

Surgery

Fast-track and minimal access surgery

Better anaesthesia, minimally invasive techniques, optimal pain control and aggressive post-operative rehabilitation were reducing patients’ responses to the stress of surgery and shortening recovery times. Earlier discharge was possible, and fast-track surgical units extended the long-standing achievements of day surgery. Operations such as splenectomy, vaginal hysterectomy and mastectomy were becoming possible on a day or 24-hour-stay basis. Purpose-designed fast-track surgical units appeared and were one way to reduce lengthy waiting lists for treatment. Minimally invasive surgery was facilitated by improvements in miniature video cameras producing good images, so that the operator and the assistant could work together. Virtual reality simulators became available for training. Laparoscopic cholecystectomy was becoming the technique of choice.

Operative procedures were now considered by NICE. Reassessment revealed the complication rates of minimal access surgery, while the techniques of open operation were improving, as in the case of inguinal hernia. NICE reviewed minimal access surgery for hernia (100,000 cases each year) and recommended in 2001 that people with first-time inguinal hernias should have ordinary (open) surgery rather than a minimal access procedure. Such centrally formulated guidelines were increasingly affecting surgeons’ decisions.

Emergency care

The rising number of attendances at A&E, changes in GP out-of-hours arrangements and the long-standing view that services were badly organised led to a review – Emergency Access: Clinical Case for Change – by Professor Sir George Alberti, National Director for Emergency Access,85 and the start of the process of reconfiguring local services.

A target was established and achieved for the percentage of patients seen within 4 hours, and systems were reviewed to try to reduce the time taken by each stage of a patient’s visit.

The early doubts about helicopter ambulance services had now been stilled and most areas had a scheme, often funded on a voluntary basis. Sometimes, as in London, the main use was to get doctors to cases of trauma rather than evacuating patients.

Virtual reality/robotic surgery

Developed in the 1980s and 1990s, virtual reality and robotics were coming to fruition. The United States Department of Defense had developed telepresence surgery to meet battlefield demands, and the da Vinci® Surgical System evolved from these efforts (1999). A main ambition was to apply it to heart surgery, but the primary use became radical excision of the prostate, sparing the nerves on which urination and potency depended. This proved a profitable use, and the number of centres providing robotic surgery in the USA mushroomed. The surgeon sat at a computer console, viewed a three-dimensional virtual operative field, and performed the operation by controlling robotic arms that held the stereoscopic video telescope and surgical instruments that simulated hand motions, permitting smaller incisions and tremor-free operating. ‘Master-slave’ robotic procedures were also used for minimally invasive coronary artery bypass grafting and laparoscopic surgery. Neurosurgeons used image-guided surgery and augmented reality.

Orthopaedics and trauma

Military medicine and trauma

A US surgeon general said that war is an efficient schoolmaster. The conflicts in Iraq and Afghanistan led to major advances in trauma care. Injured soldiers obtained care of which the normal NHS patient could only dream. A military wing at the NHS hospital at Selly Oak, Birmingham, received the injured, but consultant care was now available in the front line – “we are projecting the emergency department forward into the helicopter.” An engineered process, from early transfusion at the beginning of the chain to the weekly clinical audit sessions in Birmingham, led to better survival, including among many with the gravest injuries who at any other time in history would have died. Improved understanding of the way in which the body responded to major trauma and severe multiple injuries led to the introduction of new methods of managing them. New drugs were introduced to reduce the likelihood of multiple organ failure in the weeks after injury.

Orthopaedics looked set for advance as a result of new materials, computer-aided manufacturing technology and molecular biology. The improvement of imaging had a substantial impact on orthopaedic practice. Musculo-skeletal imaging – for example, of the knee joint – made greater accuracy possible in the assessment of suspected cartilage and ligament injuries, often substantially altering treatment.

Roughly 50,000 hip arthroplasties were performed annually, mainly for osteoarthritis. Younger patients, leading an active life, were likely to wear out the replacement hip. In the USA, minimally invasive hip arthroplasty was introduced, using a 3” incision and specially lit instruments, reducing pain and the time spent in hospital, and speeding recovery. Earlier, knee replacement had been widely considered a poor operation. By the 1990s, the basic principles of successful surgery had evolved. The joint would be resurfaced, reproducing the normal anatomy with a low-friction joint, the remaining ligaments providing stability. Some 35,000 operations were now performed per year, with about a 90 per cent success rate at ten years.

Femoral fractures in children had traditionally been treated with traction and hospital stays of 4 to 12 weeks. A new technique, flexible nailing of the femur, developed in Switzerland, allowed early mobilisation.

Cosmetic surgery

Cosmetic surgery was a growth industry – in some circles, an obsession. In 2001, over 13,400 patients were registered as having had a breast implant, mostly for cosmetic augmentation (77 per cent). The private sector mushroomed. In 2005, the Healthcare Commission took action to regulate and monitor those providing such services. The use of botulinum toxin (botox) to reduce wrinkles was popular. Boots planned to provide facilities in some of its larger stores. Collagen fillers were widely used and sugar molecule injections looked set to become the next hot cosmetic item.

Bariatric surgery

Around 25 per cent of adults in England were now considered obese, with a body mass index (BMI) of 30 or greater. Based on surgical techniques originating in the 1950s, bariatric surgery reduces the capacity of the stomach in an attempt to treat obesity that is life-threatening and has not responded to simpler dietetic measures. The surgery had grown in popularity worldwide in recent years, and in England its use increased exponentially. Gastric banding and gastric bypass were the commonest procedures, the mortality being about 0.8 per cent. Increasingly, laparoscopic procedures were used. Surgery was effective in leading to weight reduction in most people and reduced the risk of obesity-related illnesses.
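As a reminder of the threshold quoted above, BMI is calculated as weight in kilograms divided by the square of height in metres; the figures in the worked example below are illustrative only and are not drawn from the text.

BMI = weight (kg) ÷ height (m)²; for example, 92 ÷ (1.75 × 1.75) = 92 ÷ 3.0625 ≈ 30, just at the obesity threshold.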

Neurology and neurosurgery

Many new drugs were being introduced for epilepsy, not all of which were major advances. For temporal lobe epilepsy, surgery emerged as the most effective treatment. Costly drugs that had only a slight effect on diseases otherwise difficult to treat, such as the interferons in multiple sclerosis and drugs for dementia, were controversial.

Some rare neurological diseases had long been known to have a genetic cause; now gene mutations were discovered that increased the risk of developing Alzheimer’s disease. There was hope that the identification of causal factors at a molecular level would open the way to treatment to influence the course of the disease. As brain cells did not appear to regenerate, it would be important to develop an early diagnostic system, so that treatment, when available, could begin as soon as possible.

The newer systems of imaging, for example, PET, sometimes gave indications of how a neurological disease was developing, and how brain function was affected. MRI could be used to display blood vessels, including the carotid arteries. Frameless stereotaxy linked information from CT or MRI scans to sensors to position the images. Stereotaxic systems helped the surgeon to navigate safely through high-risk areas of the skull and brain, knowing exactly where the surgical instruments were. Interventional MRI provided enough space within the scanner for the patient to move and for some neurosurgical procedures to be carried out.

Stroke

Stroke remained one of the commonest causes of admission. The outlook was improved by very early thrombolytic therapy if intracranial bleeding could be ruled out by an immediate CT scan. Most district general hospitals did not have a 24-hour on-call neurological and scanning service. A National Stroke Strategy stressed the need for immediate admission of those with strokes or transient ischaemic attacks to a unit that could provide brain imaging on arrival, 24 hours a day.86 People had to recognise the early signs of a stroke, which needed to be treated as a medical emergency. Stroke care networks were developed, and services redesigned across them, so that appropriate urgent care was delivered fast; some SHAs, for example Oxford, Newcastle and London, responded effectively.

Ophthalmology

Progress in ophthalmology was steady. Foldable intraocular lenses could be inserted through a small self-sealing incision. Better drugs became available for local treatment of glaucoma and endoscopic laser techniques also helped in its treatment.

A new treatment became available for age-related macular degeneration, a common cause of blindness. It involved anti-VEGF (vascular endothelial growth factor) therapies, directed against a sub-family of growth factors. These included the monoclonal antibody bevacizumab (Avastin), antibody derivatives such as ranibizumab (Lucentis), and orally available small molecules that inhibit the tyrosine kinases stimulated by VEGF: sunitinib (Sutent), sorafenib (Nexavar), axitinib and pazopanib. NICE reviewed the evidence for their efficacy and agreed that they should be used.

Cancer

Worldwide, the occurrence of cancer steadily increased – populations were often older and other causes of mortality were being attacked. Many cancers were related to diet, but precisely which dietary habits were responsible in different countries was unclear. Infections, for example hepatitis B and Helicobacter, were also responsible. Genetic factors were being discovered. Some families had a very high incidence of particular cancers and specific gene faults could be identified. Survival from cancer in the UK was lower than in many European countries and the USA. The UK used chemotherapy less, access to diagnostic services and staffing levels were poorer, there were fewer oncologists, and 40 per cent of cancer patients never saw one. A major problem was undetected spread at the time of first treatment, and the difficulty of reducing the delay from referral or diagnosis to first treatment, particularly in radiotherapy. A National Cancer Plan (2000) aimed to improve coverage, staffing levels and service organisation.87

Cancer networks were established and effort went into re-engineering the clinical pathways. Government pledged that patients with suspected cancer would be seen within two weeks of referral by a GP. However, CHI and the Audit Commission found that standards of care were variable and too few patients, particularly when admitted as an emergency, saw a cancer specialist.88 The Calman-Hine Report89 was, in many places, far from implementation. In 2007 a new plan, the NHS Cancer Reform Strategy, was launched.90 Key elements included focus on prevention, spending more on radiotherapy equipment, faster treatment, extended screening, fast-track drug approval and extended services for the increasing numbers of people surviving cancer.

Slow improvement in survival was taking place, particularly for cancer of the breast, colorectal cancer, non-Hodgkin’s lymphoma and the leukaemias. For men in early middle age, the prevalence of smoking had halved between 1950 and 1990, and the death rate for lung cancer at ages 35–54 fell even more rapidly. However, women and older men who were smokers had higher rates in 1990 than in 1950. In 2000, the deaths of women from cancer of the lung exceeded those from cancer of the breast for the first time. Deaths from cancer of the breast fell 21 per cent between 1990 and 1998, perhaps because of screening, earlier diagnosis, and better treatment, including the addition of chemotherapy and hormonal treatment such as tamoxifen. It was hard to know which had been chiefly responsible.

Surgical treatment for cancer improved and was increasingly conservative, retaining organs and structures where possible. Radiotherapy centres were increasingly well equipped and tomographic treatment machines gave a more accurate dosage. Linear accelerators and multi-leaf collimators allowed radiation doses to be delivered to the precise shape of the tumour and helped to reduce the volume of tissue irradiated, sparing normal tissue around the cancer. Precision therapy greatly increased the staffing requirement and there was a shortage of radiographers. A new advance in radiotherapy, Continuous Hyperfractionated Accelerated Radiotherapy (CHART), was introduced. Radiotherapy was given for 12 successive days, including weekends, three times each day, with a 6-hour gap between treatments. The total dose was higher, and there might be side effects, but in lung cancer treatment the cure rate seemed better. Further ahead was the possibility of charged particle beam therapy consisting of protons or carbon ions produced in a synchrotron or cyclotron, the first facility being at Clatterbridge (Wirral). Bone marrow transplantation remained central for some cancers, and the cure rate of acute leukaemia in children improved.

Pharmaceutical research moved from new cytotoxics to drugs acting on defined molecular mechanisms. The dramatic successes achieved in some rarer cancers were not repeated in the commoner ones – breast, lung or colon. Controlled trials improved the results of chemotherapy as new agents were introduced, as for example in colorectal cancer and cancer of the breast (anastrozole and letrozole). In hormone-dependent breast cancer, five years of post-operative tamoxifen therapy prolonged disease-free and overall survival. The aromatase inhibitor exemestane, by suppressing oestrogen production, was found to improve the outcome after two or three years of tamoxifen therapy.

Some of the newer drugs were used mainly to extend the survival of people with terminal cancer. Karol Sikora, from Imperial College, predicted that, within 20 years, cancer would be considered a chronic disease to be controlled, such as diabetes and asthma. The new treatments would be more selective, less toxic and given for long periods, perhaps lifelong. The NHS would have difficulty in paying for everything available. When their effectiveness became apparent, they might be used earlier, as in the case of paclitaxel (Taxol) in ovarian cancer. Oxaliplatin (Eloxatin) and irinotecan (Campto) were approved for metastatic colorectal cancer. Three drugs for non-small-cell lung cancer were also approved – gemcitabine, paclitaxel and vinorelbine.

Monoclonal antibodies, after many years, began to live up to some of the expectations. There was rituximab (Mabthera) for low-grade non-Hodgkin lymphoma; the antibody attached to a B-cell surface receptor present in most cases. Another was trastuzumab (Herceptin), which targeted a protein on the surface of fast-growing cancer cells and was accepted for advanced breast cancer in 2002. Given much earlier to people with aggressive breast cancer carrying the HER-2 gene, it achieved a major reduction in recurrence rate. There were many monoclonals in the pipeline, each active against a receptor on a malignant cell surface. Another group of drugs active against cancer was the anti-angiogenesis agents, which kept tumours from developing good blood supplies.

Obstetrics and gynaecology

There was an ever-increasing demand for fertility treatment, often in women who had delayed pregnancy until their mid-30s, and births steadily increased. In 2007, 36,000 women received treatment, and there were more than 11,500 births from in-vitro fertilisation (IVF) or intra-cytoplasmic sperm injection (some multiple births). NHS resources being limited, most of this was undertaken in the private sector at personal cost. More than one embryo was often implanted to increase the success rate, and the number of multiple births in older women increased rapidly. Many babies were of low weight and delivered early, increasing NHS costs. The pressure to make fertility treatment part of mainstream NHS services was successful – NICE recommended that three cycles of treatment should be available to women aged 23–39 years (HFEA figures).

The 6th Report of the Confidential Enquiry into Maternal Deaths (CEMD) in the UK was published in 2004.91 Maternal death is a rare event – 391 were reported from 2000–2002. The main causes of death were thrombo-embolism, haemorrhage and anaesthesia. However, sepsis was staging a comeback, and the largest cause of death overall was psychiatric, generally depression. Women in families affected by unemployment, who lived in deprived areas, or who were from minority ethnic groups were at substantially higher risk of death than others. To reflect the rarity of maternal death and the linkage of the health of mother and child, CEMD and the Confidential Enquiry into Stillbirths and Deaths in Infancy (CESDI) merged to form the Confidential Enquiry into Maternal and Child Health.

The 7th Report in 2007, covering 2002–2005, showed that, over 20 years, there had been little improvement in the figures.92 Approximately half of the deaths directly or indirectly associated with childbirth occurred in women who were overweight or very obese, mothers who were older and often from disadvantaged communities, including immigrants and asylum seekers. Some of the avoidable deaths were the result of doctors failing to manage effectively medical conditions that were outside their normal expertise, for example heart disease. Black African women, including asylum seekers and newly arrived refugees, had a mortality rate nearly six times higher than white women. To a lesser extent, black Caribbean and Middle Eastern women also had a significantly higher mortality rate.

The nature of antenatal care – and making it more sensitive to the parents’ desires – was the subject of several reports, starting with Changing Childbirth in 1993 with a vision of midwifery-led, woman-centred care in which women could choose where to give birth. There were plenty of pilots but no systematic change. In 2007 came Maternity Matters,93 which set out the wider choice framework for maternity services, including a guarantee (within safety limits) of choice over where to give birth, how to give birth, and what pain relief to use.

The pattern of antenatal consultations and care had remained much the same since the time its importance was first appreciated. In 2003 and 2008, NICE recommended changes suggesting that care should begin earlier – around eight weeks – to give women more time to make decisions about screening, and to plan the kind of antenatal care that they would like. Ultrasonography should be carried out at 10–13 weeks, to increase the accuracy of pregnancy staging. Healthy women pregnant for the first time should typically be offered ten appointments, and seven in subsequent pregnancies. All women should be offered screening for fetal Down syndrome during their pregnancy. To safeguard the health of the baby, networks of neonatal units were established so that women in labour could be transferred when desirable to a unit providing the most intensive neonatal care. During delivery, continual monitoring of the fetus became near universal. Defensive obstetrics was important because hospitals needed evidence of good care in case of future legal actions.

Many obstetricians believed the caesarean section rate to be too high; a survey by the Royal College of Obstetricians and Gynaecologists placed it at 22 per cent. Concern about injuries to the pelvic floor was among the reasons for rising caesarean section rates.

The abortion rate had been rising since records began. It was 14.8 per 1,000 women in 1988, but 18.2 per 1,000 in 2007. The rate for under-16s was 4.2 and for under-18s it was 18.9. The highest abortion rate was seen among 19-year-olds, at 36 per 1,000 women. By 2007, almost 200,000 terminations were being performed annually. From January 2001, the ‘morning after’ pill became available over the pharmacist’s counter rather than solely through medical channels. A third of abortions were carried out on women who had already had one.

Clinical genetics was increasingly applied to diagnosis and treatment. Pre-implantation genetic diagnosis developed apace. The ability to screen in vitro for the inherited Fanconi syndrome enabled a clinical tour de force – the selection for implantation, from a number of embryos, of one unaffected by the disease. The HFEA also decided to allow selection of an embryo so that a baby could become a donor for a seriously ill sibling – a ‘saviour’ sibling. Similarly, women in families predisposed to cancer of the breast could be assessed so that only embryos free from the significant gene were implanted.

Gynaecology

Improvements in imaging, endoscopy and drug treatment all contributed to steady advance. Fibreoptic endoscopes enabled the replacement of some major operations by minimally invasive procedures. Ectopic pregnancy could be diagnosed early by ultrasound and treated by laparoscopic surgery. New approaches to the treatment of heavy menstrual loss were developed, for example, endoscopic ablation of the endometrium by laser. Simpler methods of treating endometrial polyps were also possible. Better understanding of the risk and frequency of incontinence after delivery, and the technique of the repair of tears, offered more hope to women suffering such embarrassment.

Problems with reading cervical cytology slides emerged from time to time. Even in units recognised to be of high quality, the inherent difficulty in reading test results led to audits that showed repeated failures to identify abnormal smears.

Despite decades of accumulated observational evidence, the balance of risks and benefits for hormone replacement therapy (HRT) in healthy postmenopausal women remained uncertain. A US trial94 of a combined oestrogen/progesterone preparation was stopped in May 2002, after an average 5.2-year follow-up among healthy postmenopausal US women, because the risks (for example, invasive breast cancer, heart disease and stroke) exceeded the benefits (for example, a reduction in fractures). More than 2 million women were reported to be taking HRT in Britain, and a Medical Research Council (MRC) trial of long-duration oestrogen after menopause had been under way since 1996. It was decided in November 2002 to bring this trial to an end as well. The results of the ‘Million Women Study’95 confirmed the increased incidence of, and death from, breast cancer, particularly for oestrogen-progestagen combinations, leading to calls for its prescription to cease. HRT might be responsible for an additional 5,000 cases of breast cancer in the country annually.96 In 2006, it was reported from the USA that the number of cases of breast cancer was falling since many women had stopped taking HRT. The uncertainty, however, remained; later, in 2017, it was reported that there was no increase in all-cause mortality in those taking HRT. It became increasingly clear that, for many years, the pharmaceutical industry had quietly encouraged the use of its products by providing financial support to meetings that were essentially public relations exercises, and to organisations encouraging the use of HRT, some of which had apparently been established by groups of oestrogen product manufacturers.

Paediatrics

Children’s services had been reviewed over the years.97 A Children’s Task Force was established in 2000 to lead the development of a national service framework to improve the “fragmented and poorly co-ordinated” services that children might receive (2004).98 Professor Al Aynsley-Green’s framework reiterated principles of care expounded since Platt. Children were different, needed to be looked after by people who understood their particular needs, and should have services designed specifically for them. Accessible and age-appropriate services were needed, with earlier diagnosis and intervention, and a smoother transition from child to adult services. There must be someone at senior managerial level in every NHS organisation who took responsibility for ensuring that children’s voices were heard. All newborn babies should have access to the most appropriate care where and when they needed it, and there should be regional networks of care to minimise the need for transfer.

The increasing ability to diagnose fetal defects by antenatal ultrasonic scans, and the emergence of feto-maternal medicine as a specialty, meant more work for paediatric surgeons. Spina bifida, kidney and bladder diseases might be diagnosed before birth. Surgery immediately after birth could be planned, and where the defect might result in the death of the fetus or neonate, fetal surgery was sometimes possible endoscopically.

Organ transplantation in children presented difficulties, as the drugs used to suppress immune reactions might affect growth. Yet the outlook of children dying of liver, kidney or heart failure was revolutionised. Liver transplantation was extended to the neonatal age group. Congenital heart disease and cardiomyopathy could be treated by heart transplantation.

Geriatrics

The elderly were now usually admitted to acute wards, but the care they received was still sometimes lacking. The Health Advisory Service reported that the fabric and design of the wards were often poor, equipment might be unavailable, ward routine might be inflexible, and patients might not be helped to eat or drink.99 The Health Advisory Service preferred specialised facilities for the elderly rather than those integrated with other facilities. It asked for national standards: older people should be helped to eat and drink, should lie in a clean, dry bed, and should be treated with respect. In 2001, the Standing Nursing and Midwifery Advisory Committee’s Report Caring for Older People: A Nursing Priority was published.100 The report said:

There is a great deal of evidence to support the conclusion that the care that older people receive often fails to meet their most basic needs for food, fluid, rest, activity and elimination and the psychological and mental health needs of older people are often entirely neglected in acute health care settings. The nursing care of older patients is mainly deficient in terms of fundamental skills, such as communication and helping a patient to maintain their nutritional status, skin integrity and continence.

In March 2001, the government published a National Service Framework for Older People as a ten-year programme setting national standards covering ‘age discrimination’, person-centred care, hospital care, specialised stroke services, falls, mental health and the promotion of an active healthy life.101

Labour, in opposition, had accused the Conservative government of forcing thousands of pensioners to sell their homes to pay for long-term care. The Royal Commission on Long-Term Care for the Elderly (1999) proposed making all nursing and personal care, including help with washing and dressing, free to all needing it. Labour was now cool towards this solution, with its immense costs.102 The problem was at least as great for the social services that were often responsible for the costs after discharge. Private residential and nursing homes began to close because payments did not cover the cost of providing a staff-intensive service, and meeting newly introduced standards of accommodation. Hospital Trusts reported that it was difficult to discharge older patients requiring care. 

Mental illness

A new vision was emerging in the mental health services. Two big ideas were that patients with mental illness should be treated in the community, and that young people with early psychosis should receive timely and comprehensive intervention at the outset. The National Service Framework for Mental Health (1999) followed pledges in the NHS Plan and, increasingly, money flowed into mental health services.103 The key areas of the National Service Framework included phasing out mixed-sex psychiatric accommodation, the elimination of out-of-area acute admissions as soon as possible, 24-hour access to mental health services for patients and carers, new training for psychiatrists in ‘cultural awareness’, and a drive to encourage more nurse consultants in the mental health area, particularly to work with people with mental ill-health and drug and alcohol problems. Over several decades, large mental health hospitals had been closing.

As patients were discharged into the community, some old mental hospitals such as Friern Barnet (North London), St Francis (Haywards Heath), Oakwood (Maidstone) and The Royal Holloway (Virginia Water) were converted into high-quality living accommodation for the well-to-do. They were replaced by community psychiatric teams, generally with too few staff, yet with wide responsibilities for cases of many different types. This ‘generic’ approach, where teams were responsible for all the problems in their community, often proved inadequate. Increasingly, teams with particular skills and smaller caseloads were formed to provide specialised services, for example, crisis and early intervention services (to reduce the need for admission), personality disorder and prison in-reach. Crisis Resolution Home Treatment (CRHT) teams helped people through short-term mental health crises by providing intensive treatment and support outside hospital, ideally at home. Made up chiefly of mental health nurses, with additional input from consultant psychiatrists, social workers, occupational therapists and psychologists, they could denude the community health teams who were left to handle large caseloads of people whose problems did not fall into specific categories.104 The National Audit Office reported in December 2007 that the introduction of CRHT teams was associated with reduced pressure on beds, and the teams were successfully reaching service users who would otherwise probably have needed admission. CRHT teams were also supporting the earlier discharge of people from inpatient treatment – for example, in around 40 per cent of the discharges in the National Audit Office sample.

Organisational change had a substantial impact on psychiatry and community care. Changing boundaries of the Mental Health Trusts made the development of teamwork with social work services difficult. Yet care in the community as a policy produced new problems of its own. It worked for some people; it let down many others, for whom it might mean fighting for mental and physical survival alone in flats and bedsits, or with families who broke down under the strain. There was a famine of community psychiatric nurses, and there needed to be enough hospital places – places of asylum in the true sense of the word. The Mental Health Act 1983 was reviewed and Modernising Mental Health Services105 proposed a strategy with two essential elements:

  • Increased investment to provide more beds, outreach facilities and 24-hour access and new treatments
  • Increased control of patients to ensure compliance with treatment in the community, and a new form of reviewable detention for those with a severe personality disorder.

In 2001 a White Paper, Reforming the Mental Health Act,106 aimed to deal with concern that community care had led to the release of hundreds of patients, some of whom did not receive care and others of whom became a risk to themselves and the community. Such releases had contributed to 1,000 suicides and 40 murders a year. Finally, a new Mental Health Act was passed in 2007, which allowed people with serious personality disorders to be detained, even if they had committed no crime, if they were a danger to themselves or others. It also allowed compulsory treatment in the community under certain circumstances.

In drug therapy, new and ‘atypical’ antipsychotics were introduced, such as amisulpride, olanzapine, quetiapine, risperidone and zotepine, which were said to produce fewer extrapyramidal side effects and to reduce suicide rates compared with drugs such as chlorpromazine. They were many times more expensive, but there was public and professional pressure for their widespread use. It was 20 years before careful examination of the clinical trial evidence showed major flaws, for example comparing the new drugs with haloperidol, a drug known to have many serious side effects. Marketing and publicity had overtaken science. Though the evidence for the cost-effectiveness of these ‘second generation’ drugs was equivocal, in 2002 NICE approved their use, both for patients with side effects from the traditional drugs and as a first treatment.

The extent to which ‘street drugs’ such as crack cocaine were often taken by patients with recognised psychiatric problems was an increasing hazard. 

Learning disability

Hospitals for those with learning disabilities had been the centre of scandals in the 1960s and 1970s. In 2007, the Healthcare Commission found that Orchard Hill Hospital in Sutton and Merton was “in a time warp” and providing care at an unacceptable level, describing:

 some of the environments in which people lived as impoverished and completely unsatisfactory. Staff were not properly trained or supported to provide an acceptable level of care, and inadequate levels of staff meant that people were often left day in day out with little to occupy their time. There were failures in management and leadership at all levels, from managers to the trust’s board.

The Commission decided to inspect some 200 similar institutions.

General practice and primary health care

The workload of primary health care was rising. Between 1995 and 2008, the number of patient consultations in practices rose by 75 per cent, from 171 million to more than 300 million. GP consultations rose by 11 per cent (3.0 consultations per patient-year in 1995 and 3.4 in 2008), and nurse consultations by nearly 150 per cent (0.8 consultations per patient-year in 1995 to 1.9 in 2008). For the average patient, the number per year rose from 3.9 in 1995 to 5.5 in 2008, with the biggest increases taking place amongst those aged over 70 years.

The concept of the GP as the sole entry point to primary care and ideally to the hospital system was breaking down. A White Paper, Building on the Best, suggested that commuters might have a GP near their work as well as near their home.107 Pharmacists and nurses were now competitive providers. Government became eager to ensure that new practices were opened where they were needed, and newer contractual patterns such as Alternative Providers were devised.

Walk-in centres

Walk-in centres were piloted in stations and shopping malls, some near major hospitals. By 2004, 87 were in operation or planned, and private sector organisations such as Care UK obtained contracts to run some. Nurse-led minor injuries clinics and triage points tended to see patients of working age during the working day, when people were away from their normal GP, at work or on holiday. They offered simple advice and treatment for minor health problems using computer-based decision support software. The most common reasons for attendance were minor viral illnesses, unprotected sexual intercourse (emergency contraception) and minor injuries. Most could be handled without onward referral and they did not greatly affect the workload of local GPs. Opening hours were wide (usually 7am to 10pm every day). The average consultation length was 14 minutes.108

Pharmacists

A new pharmacists’ contract in 2004 encouraged pharmacists to expand their role into chronic disease management, supervision of repeat prescriptions, smoking cessation and other appropriate services. A White Paper in 2008, Pharmacy in England, Building on Strengths – Delivering the Future109 heralded further expansion of the role of pharmacists.

GP-led Health Centres

PCTs were instructed to develop new GP-led health centres to provide better access. They opened from 8am to 8pm, seven days a week, providing access to anyone who turned up, whether registered or not, but also registering patients like a normal practice.

NHS Direct

Department of Health negotiators had, at times, confronted the BMA about the accessibility of family doctors compared with banks such as First Direct. In 1997, the head of operational research in the Department of Health asked “what would an NHS look like that was radically reconfigured so that demand could be handled by direct means?” (such as the telephone, TV and the Internet). The subsequent establishment of NHS Direct in March 1998 was part of a rapidly developing mosaic of first points of contact for health care. High on Labour’s priorities, this 24-hour telephone triage system, operated by nurses, advised callers on the most appropriate form of care. Sophisticated computer-based software helped the nurses, reducing the possibility of dangerous errors, and the service became nationwide in 2000. Call centres might be based with ambulance services, which had much of the necessary infrastructure. NHS Direct responded to consumerism, doing for health care what cash machines had done for banking, offering a more accessible, convenient and interactive gateway. The commonest reasons for calls were rashes, abdominal pain, dental, tooth and jaw pain, and medicines advice. The website was expanded to incorporate a health encyclopaedia about common conditions. Surveys showed substantial user satisfaction, although only 64 per cent of callers managed to speak to a nurse within five minutes and one in five callers had to wait more than 30 minutes for a nurse to call them back. It was no cheaper than GP consultations and many patients were referred on. There was no evidence about whether NHS Direct made more or fewer mistakes than a GP would have done. A National Audit Office report stated that, on the evidence, NHS Direct was operating safely, and advice to callers erred on the side of caution; there were 29 adverse event cases in three years, fewer than one for every 220,000 calls. NHS Direct undertook continuous audit. Government aimed to make it a single access point for out-of-hours care and, in 2004, it was established as a Special Health Authority. By the end of the decade, the service was answering 25,000 calls a day at 35 call centres, and employing 3,000 people.

Practice premises

As part of the NHS Plan,110 government introduced a new way to fund premises – the Local Improvement Finance Trust (LIFT) programme – in which primary care providers would collaborate with private investors to build facilities. Most were owned by the private developer, but some were developed with public money and money from the GPs if they wished. By 2002 a total of 42 such centres had been announced, and the first (Manor Park), costing £4.9 million, was opened in 2004 in Newham. It contained three general practices, health visitors, accommodation for dentists in the future, a pharmacy, a cardiology clinic, X-ray, pathology and optometry services, and a healthy living café. Particularly in areas where there was a shortage of GPs, government began to look to private providers. In May 2006, Care UK opened primary care facilities in Barking and Dagenham, family doctor services being provided to the NHS by the private sector.

Nurse substitution and GPs with Special Interests (GPwSI)

In February 2002, partly driven by the problems of GP recruitment, the BMA’s Health Policy and Economic Research Unit proposed the greater use of nurses (themselves in short supply). Nurses were becoming increasingly important in the provision of primary health care. Government considered that ‘community matrons’ might remove the need for the patient to attend the GP surgery or outpatients, or to stay in an acute bed unnecessarily. More efficient use of doctors’ time might allow them to devote more time to tasks that required advanced levels of clinical training and specialisation. Full nurse substitution would require a pattern of nurse training that included the complex problems of diagnosing ill-defined and newly presenting conditions. Both general practitioners and government were coming to view the role of primary care differently. The NHS Plan seemed to suggest that much of what the GP did could be done by nurses, and that GPs might become intermediate-level specialists. General practitioners and nurse practitioners were increasingly specialising in particular aspects of clinical management, education or research.111 GPs were taking on tasks previously more commonly the responsibility of secondary care, providing such services for patients outside their own practice, and being paid specifically for them. The Royal College of General Practitioners (RCGP) and the Modernisation Agency approved of the policy of accrediting GPwSIs, who might provide a service that could avoid hospital referral. By 2003, there were more than 1,250 GPwSIs in specialties such as dermatology, ENT and cardiology. Often they worked in fields where there was a long waiting list to see a consultant. The additional equipment they needed was extensive and expensive, costing anything from £10,000 to £40,000, and the cost-effectiveness was open to question. Rapid access to hospital facilities for complex investigation was also necessary. Several models were developed: services fully independent of secondary care, services with close hospital support, services with consultant triage, and services fully based in the hospital. A measure of specialisation appealed to some GPs, but such services were an addition to, rather than a substitution for, secondary care.

24-hour responsibility

Traditionally GPs had undertaken evening home visits and emergency night calls. Continuity of care had been a core value. With the introduction of deputising services in the 1960s and, later, GP co-operatives, up to 90 per cent of GPs had devolved out-of-hours care. Patients wanted 24-hour access, over half the entrants to practice were women, and GPs increasingly opposed a 24-hour contract. But the GP’s contract was for 24-hour patient care and pay had been on that basis. In 2000, John Denham, the Health Minister, accepted that out-of-hours primary care would move from the GP to NHS Direct, and the members of the BMA English GP Committee voted to end their 24-hour legal responsibility, forcing a review of this principle. A new contract in 2003 gave GPs the right to opt out and transfer their out-of-hours responsibilities to an “accredited organised provider of out-of-hours services”, subject to the approval of their PCT. Virtually all did, and PCTs became responsible. The cost escalated, and in England was £392 million in 2005, 22 per cent higher than predicted. Deputies were scarce and some PCTs had to fly doctors in from Europe. Concern about a deterioration in services became a political issue and Gordon Brown, the Prime Minister, wanted GPs to provide more services for more of the day. GPs did lengthen their surgery hours, but did not resume 24-hour responsibility.

Our Health, Our Care, Our Say

Our Health, Our Care, Our Say: A New Direction for Community Services112 was published by the Department in 2006, stressing patients’ independence, wellbeing and choice, and the importance of community services. In 2007, the RCGP published a roadmap on the Future of General Practice, agreeing that GPs were no longer the main entry point to the NHS and were not working to a single model of practice organisation.113 Perhaps they should become ‘federated’ into larger groups to provide a wider range of services (such as scans and X-rays), covering most health problems in the population (including mental health), closer to patients’ homes. Hospitals should be reserved for acute illness, specialised investigations and major surgery. Then the Department of Health published NHS Next Stage Review: Our Vision for Primary and Community Care,114 with an emphasis on prevention and a vascular risk assessment programme for those aged 40–74 years, a Quality and Outcomes Framework for pay, and increasing choice for patients.

General practice was beginning to be restructured. The idea of polyclinics re-emerged, either as very large centres or as a hub-and-spoke pattern with central facilities used by local practices. Darzi, reporting on health services in London, pressed for polyclinics, providing walk-in services as well as those for registered patients, with specialist and investigative support.115 He wanted 100 new practices in areas with poor provision, delivered by a wider range of providers, some perhaps in the private sector. Government policy aimed for a fundamental shift of care from hospitals to more community-based settings. There was a highly politicised row, and the King’s Fund, in a report Under One Roof,116 concluded that shutting down smaller practices and a single highly centralised model would not be suitable for all areas. A poll conducted for The Times revealed contrasting views of doctors and patients. Almost half of patients surveyed thought that polyclinics would improve the standard of care and access, and that seeing the same doctor on each visit or in an emergency did not greatly matter. GPs opposed them and argued for continuity of care.117 One private provider, United Health, obtained the contract to run three practices in North London. Sainsbury’s and Asda piloted the provision of surgery space to local doctors in stores in the evening. GP organisations feared that the strengths of British general practice were being undermined by officials lacking understanding of primary health care.

Contractual issues

From 1948 until 1997, all GPs worked under a single contract for services, nationally negotiated and set out in ‘The Red Book’. The NHS (Primary Care) Act 1997 was passed shortly before Labour came to power and allowed health authorities to commission primary care services from GPs and others in new ways. A new-style contract for Personal Medical Services (PMS) was often with a group or practice, instead of with individual GPs, and was for a defined package of services. Contracts were local, rather than national, with a firm linkage to quality, and services could be tailored to the needs of specific groups, such as the unemployed. It became possible to test alternative systems and to try out new mixes of skills. The standard contract did not meet the needs of all communities equally well and, in 2003, the Health and Social Care (Community Health and Standards) Act allowed PCTs to commission “anyone capable of securing the delivery of those services”. It established four contracts:

  1. General Medical Services (GMS), the traditional national contract, but now held with practices
  2. PMS
  3. Primary Care Trust Medical Services (PCTMS) in which PCTs provided services themselves, directly employing staff
  4. Alternative Provider Medical Services (APMS) in which the PCT contracted with an individual or an organisation, for example, independent sector organisations such as United Health Europe or Care UK, the voluntary sector, other PCTs, Trusts or even a parallel contract with GMS or PMS practices.

The legal basis of practice had changed, and commercial law was now involved in contracts, not public law alone. The monopoly of individual GPs as contractors was broken.118 Alan Johnson, Secretary of State, felt that existing practices had no God-given right to particular populations.

Personal Medical Services (PMS)

Flexible salaried contracts negotiated locally had been discussed for years and might meet the needs of inner cities better. The traditional contract, highly structured, made short-term employment impossible. A salary was possible under PMS. Firms such as Boots, and some groupings of general practitioners, took advantage of the opportunity.

PMS was piloted and became a mainstream option before evaluation was completed. GPs seemed willing to trade income for better conditions, freedom from out-of-hours working and administrative responsibilities, and an ability to work part time. Being an employee appealed to doctors early in their careers, to those approaching retirement, and to women; and so PMS was found in affluent as well as deprived areas. There was a progressive shift towards PMS, often with salaried GPs working alongside ‘independent contractor’ GPs. In a succession of waves, the numbers of GPs involved increased steadily, and it became a permanent alternative to the GMS contract. By 2005, 40 per cent of GPs worked under PMS contracts. Fixed one- to three-year contracts and a competitive salary of perhaps £55,000 for a ten-session week made the prospect appealing.

Towards a new contract – contractual discussions

Both the profession and government wanted changes to the contract to reflect emerging patterns of practice. GPs wanted to reduce the scope of their responsibility to a core of essential services (so that additional work would be separately priced), to remove the obligation to provide 24-hour cover, and to be able to choose their pattern of work. Government wanted teamwork, better access for patients, and an emphasis on quality of care as in previous negotiations. The rewards should be for the quality and range of services, rather than speed in seeing patients. Postgraduate education and clinical audit would become mandatory. Many of the changes were only possible because of the increasing use of computers in recording practice activity. In June 2001, GPs voted to resign if an agreement could not be reached, and Alan Milburn handed over the perennial hot potato of negotiations to the NHS Confederation, losing the expertise of his own team. The new NHS negotiators did badly and ended by paying vastly more money for somewhat less work. In April 2002, new proposals were sent out for consultation immediately after the announcement of more money for the NHS.

Lists would be practice-based, not individual. Payment would be based on the number of patients, the services offered, the quality to which they were delivered, and the needs of patients taking account of their age, sex, deprivation and morbidity.

All GPs would provide essential core services: the care and management of those who were ill or thought they were ill, and the general management of the terminally ill. Most would boost their income by offering additional services, including vaccination, cervical screening, antenatal care and the management of chronic conditions. There would also be enhanced additional clinical services, some of which would be needed everywhere, while others might be specific to a particular area. A proportion of a practice’s payment would come from a quality element of the contract, which might include clinical quality standards (process), organisational standards (structure) and patient experience. There would also be an annual ‘achievement’ payment. Practices would be given money for infrastructure expenditure such as additional premises, staff or information technology. GPs would be allowed to drop out-of-hours responsibilities.

GPs had asked for and got a radical new contract, losing 24-hour responsibility and being rewarded for high-quality care. The contract was based on targets – some 147 performance indicators. Government announced increased resources for primary care of some 30 per cent over three years. It used to be said that half a dozen people – three in the BMA and three in the Department – fully understood how GPs were paid. Now nobody could grasp the new contract: “more complex”, said the BMJ, “than the Minotaur’s labyrinth”. It used a complex formula based on weighted list sizes which produced odd effects. Ultimately, by four to one, GPs accepted the deal and, in September 2003, the detailed terms were agreed. Implementation took full effect from 1 April 2004.

The Quality and Outcomes Framework (QOF)

The QOF became far more important than most had thought, and a revolutionary change. Of 147 targets introduced in 2004, 76 were clinical and ten involved long-term conditions. Most practices improved the quality and range of the services they provided, to which some 20 per cent of the primary care budget was tied, resulting in significant increases in pay which gave PCTs and government a financial headache. The contract was estimated to have cost £300 million more than had been expected because the average net salary rose by over 30 per cent, to more than £100,000. The National Audit Office, in a careful account of the problem and the negotiations, reported that the additional money had achieved at least some (but not all) of the anticipated benefits. It had not helped deprived areas much, and there were doubts about productivity.119

Systems of organisation, PCTs and commissioning

Family doctors, for years administered by bodies established solely with them in mind, were now managed within a broader framework. After NHS reorganisation in 1974, they were within the remit of Area Health Authorities, but these remained at arm’s length. However, the amalgamation of Family Health Services Authorities into Health Authorities (1996) altered the situation. Because primary care determined the work of the hospital services, health authorities and PCTs became managerially active. For the first time since 1948, general practice/primary care and community care were brought into a single organisation with a unified budget, managing family doctors, running community nursing and setting the contracts for hospital services. PCTs became planners and funders, not simple administrators of GPs.

The rise and fall of fundholding

In the early 1990s, the Conservatives tried to contain increasing demand and rising costs by the purchaser-provider split and the introduction of GP fundholding. Hospital contracts were placed in two ways: by fundholders for their patients, and by district authorities for the rest. Though voluntary, the take-up of fundholding was spectacular; by 1998 about 55 per cent of the English population were covered by some kind of fundholding arrangement. Fundholders were energetic, and academic evaluations suggested that they cut elective admissions and waiting times. They drove service reform and led doctors to consider management issues. Combining financial and clinical decision-making, fundholding harnessed the enthusiasm of GPs eager to develop their practices. However, having many commissioners and contractors increased transaction costs, and there was some evidence of two-tier access to health care between patients of fundholders and patients of non-fundholders.120

GPs who opposed fundholding but wanted to shape secondary care responded by teaming up to advise health authorities through ‘locality commissioning’, a model more to Labour’s liking. On achieving office, however, Labour determined that fundholding would end in April 1999, at which time GPs were brought together within PCGs to commission all secondary care except mental health services. In England, around 500 PCGs, each covering a population of around 100,000, took over from nearly 400 health authorities, the fundholders and the locality commissioning groups. Later, PCGs evolved into PCTs. From April 1999, all GPs in England and Wales worked within PCGs/PCTs that commissioned or purchased secondary care for their populations. PCTs were not definable in neat geographic terms, the areas being decided by the homes of those on the lists of participating GPs. The PCTs were becoming mini-health authorities, whose chief executives had often worked for the previous health authorities. PCTs increasingly concerned themselves with the complexities of secondary care and matters beyond primary care. Recruitment of top-calibre staff, however, was difficult, though mergers sometimes helped, as did sharing back-office functions.

PCTs increasingly influenced the way GPs worked. Responsible for implementing the new GP contract, they might employ GPs with special interests to achieve their goals, or encourage the adoption of PMS contracts, which allowed the Trust greater management powers. Practice-based commissioning (PbC), introduced in 2005, was an attempt to achieve the advantages of fundholding without its downside. In spite of encouragement, it was not successful. The effects were patchy, with GPs slow to get involved. It was described by a primary care tsar as “a corpse not fit for resuscitation.”

Recruitment

The number of doctors completing vocational training had been dropping and practices found it more difficult to recruit new colleagues. Once, general practice had been the first career choice of 40–50 per cent of newly qualified doctors, but now it was down to 20–25 per cent. Many retirements were imminent. Most entrants to practice were now women, frequently wishing to work part time; general practice seemed to offer a better work–life balance. Yet numerically the GP workforce was steadily increasing. In England in 1998, there were 28,251 GPs excluding registrars and those retraining, and in 2007 there were 33,364. The increase in pay was one factor.

In 2007, membership of the RCGP became a necessary requirement for entry into NHS general practice. Responsibility for the equitable distribution of GPs moved from the Medical Practices Committee to PCTs, which managed recruitment within national guidance. PCTs were divided into groups according to their number of GPs expressed in terms of weighted population. However, there was no superior authority to ensure that the under-doctored areas got the most applicants.

Hospital and specialist services

During the decade there were:

  • staffing increases
  • substantial reconfiguration of hospitals
  • increased workload but falling waiting times for outpatients and admissions
  • introduction of systems to assess quality
  • a major hospital building programme funded by Private Finance Initiative (PFI).

Staffing

Nursing and medical school output rose to increase staff numbers and capacity as money was pumped into the NHS. There were 1.3 million NHS staff members in 2007, just over 50 per cent being doctors or qualified nurses (128,200 and 399,600 respectively). After years of negotiation, a new pay system, Agenda for Change, was introduced for all directly employed NHS staff, except very senior managers and those covered by the Doctors’ and Dentists’ Pay Review Body, to harmonise conditions of service, provide a clearer system of rewards for staff working flexibly, and assist in the development of new types of job.

NHS workforce in England, 1997–2003

Workload and bed numbers

Hospitals were under increasing pressures. Demand and volume increased as better and less-traumatic forms of care became available. Each month acute hospitals saw over a million new patients in acute specialties, and a further million attended A&E departments. The reduction in the GPs’ gate-keeper role, particularly out of hours, was a factor in the rising demand.

In 1998, at a time of scandalously long waiting lists, the government established a National Beds Inquiry chaired by the Chief Economic Adviser of the Department of Health, Clive Smee.121 Reporting in 2000, it showed that the number of staffed hospital beds in England had peaked in 1960 at a quarter of a million, and then fallen steadily to 147,000. Elective admissions had remained static, but emergency ones had risen steadily to 60 per cent of the total. Those over 65 years of age were major and increasing users of the service. York University’s evidence to the inquiry concluded that about 20 per cent of the days older people spent in hospital might not have been needed had other, intermediate, facilities been available.

The inquiry concluded that more, rather than fewer, beds were needed to meet the needs of patients in the twenty-first century NHS. It outlined three options: an increase in the number of acute beds; an increase in health services in the community; or the provision of ‘intermediate care’ services, to prevent avoidable admissions and make discharge home easier. Intermediate care had problems. Management, in search of money to keep acute hospitals running, had often closed the natural centres for such care, the GP hospitals and community hospitals. There was little evidence that such facilities could deliver effective outcomes in a cost-effective manner; the ‘hospital at home’ schemes had not done so. Neither had anyone explained how, if intermediate facilities were to be funded from money taken from the existing acute hospitals, those hospitals would keep within budget, increase their throughput and improve the quality of their own services.

A year later, in 2001, after the Blair ’Breakfast with Frost’ interview, the NHS growth rate was doubled. Capacity was seen as a constraint on the reduction of waiting lists, and Alan Milburn accepted that more beds were needed. Bed closure to keep within budget ceased to be encouraged, and new PFI projects no longer had to reduce bed numbers. Authorities were asked to produce plans for increasing their capacity. Staffing shortages in almost all staff groups were a major restriction in the hospital service. Yet uncertainties remained. The NHS Confederation, representing NHS authorities and trusts in England, issued Why We Need Fewer Beds, a good analysis of the factors affecting bed numbers and the changing pattern of hospital care.122

Labour’s manifesto had promised to reduce the waiting lists inherited from the Conservatives by 100,000. Successive governments had grappled in vain with the problem and, against all predictions, Labour made great progress. There were three phases in government policy. From 1997 to 2000 government focused on waiting lists rather than waiting times, and slightly increased funding. From 2001 to 2004 funding increased dramatically and Alan Milburn chose to focus on waiting time targets and performance management. Then from 2005 to 2007 government expanded supply, increased patient choice and introduced competition. Some Trusts paid huge sums to tempt doctors to work out of hours to reduce lists. Over the next few years, matters slowly improved. Sometimes figures were falsified, and the Audit Commission highlighted Trusts where there had been deliberate misreporting of waiting lists, as did the National Audit Office. As waiting times fell, targets were tightened from 12 to 6 months and then to 18 weeks from referral to treatment for elective care. By March 2008, something like nine out of ten elective patients were treated within this time span.

Hospital reconfiguration and design

Hospital development had been planned on the basis of an assessment of local health needs. The development of the ‘market’ in the 1990s led to a move to business planning – financially driven, short term and institutionally based. However, as new policies, such as quality, clinical networks, patient choice and changes in financial flows, were developed, this became inadequate. Good analysis and forward planning were increasingly needed, especially as financial stringency made reconfiguration of services important.

Since 1948, district general hospitals had been the building blocks of the hospital service, accepted by Powell’s Hospital Plan (1962)123 and the Bonham-Carter Report (1969).124 Reconfiguration was driven by increasing sub-specialisation and the recognition that outcomes were often better where work was concentrated. The reduced working hours of medical staff had made it difficult to staff smaller hospitals out of hours. Other factors included the closer alignment of research, education and services within academic health science centres, to ensure that scientific advances were translated into better care, and the hope that more care might be delivered by GPs in community-based centres.

These ideas were absorbed widely within NHS management and subsumed into the reports from the National Directors of Clinical Services, and later Lord Darzi. A 1998 report of a joint working party of the BMA, Royal College of Physicians (RCP) and Royal College of Surgeons (RCS) suggested that a single general hospital should now serve a population of not less than 500,000.125 Such hospitals should relate to a tertiary service provided for around a million people and would contain independent departments of super-specialty care. The Senate of Surgery of Great Britain similarly believed it was essential to reconfigure and centralise trauma services for severe injuries, and the Academy of Medical Royal Colleges (the umbrella body of 20 UK colleges and faculties) looked at acute services, specialty by specialty, in a working party report in 2007.126 Emergency medicine required reorganisation, as fewer physicians could practise general internal medicine effectively – many sub-specialists lacked competence in managing common medical causes of admission. Medical assessment units and medical admission units were now usual. The RCP thought a new specialty should form – physicians whose prime responsibility would be to manage the acute medicine service, lead multi-disciplinary teams and support colleagues in A&E departments, in high-dependency units and on general wards.

As SHAs attempted to restructure services, for example, in north London, inevitably there were protests from local communities and MPs. In 2003, the Independent Review Panel was set up to advise the Secretary of State on contested reconfigurations, as had been proposed in the NHS Plan (2000). Over ten years, it reviewed 19 cases, sometimes supporting the proposals, but often finding that their basis was doubtful or they had not been presented effectively to local people.

Professional groups as well as management were involved in reconfiguration, for example, through clinical networks. Cancer and neonatal paediatric care networks involved cross-referrals from local units to more specialised ones. In ophthalmology in London, Moorfields developed multiple peripheral centres providing outpatient services and some surgical and inpatient care as well. By 2007, Moorfields was working with St George’s, Barts and The Royal London, Homerton, Watford, Barking, Mayday (Croydon), Northwick Park and Ealing. Moorfields had found a way of increasing the hospital’s population base, necessary for its survival, research and education, and of providing a good clinical service to hospitals that might not have been able to provide it themselves. It linked with University College London through the Institute of Ophthalmology, and with City University, as its academic partners. The Royal Marsden, similarly, opened a satellite unit at Kingston.

Treatment centres

In 2001, plans were announced to build 26 treatment centres in England at new and existing hospitals, financed by PFI or public funds. Designed for ambulatory/day care, they would help to meet waiting time targets by increasing capacity, introducing competition, forcing efficiency and increasing patient choice. Some were NHS-run, but independent sector treatment centres (ISTCs) were also built and opened. They offered fast, pre-booked day and short-stay surgery and diagnostic procedures for which there were often long waiting times, such as ophthalmology and orthopaedics. The Prime Minister announced that 250,000 consultant episodes, some 8 per cent of the total, would take place in the private sector. The first wave of ISTCs was guaranteed volumes of patients and payment some 15 per cent above NHS tariff costs to recognise the start-up costs. In October 2004, a second wave doubled the prospective caseload to 500,000. The contribution of the private sector was small, but its existence made it easier for hospital management to improve hospital working practices and reduce waiting times.

Many Trusts felt that they could provide additional capacity themselves and resented the attractive contracts given to ISTCs when some NHS centres, including the pioneering one at the Central Middlesex Hospital, were working far below capacity. By 2005, 29 centres, NHS and private, were in operation and 100,000 patients had already been treated. There were concerns about the effect on the training of young surgeons and about the quality of the surgery. Assessment was virtually impossible because of poor routine data collection systems and the insistence that data was ‘commercial in confidence’. There seemed to be a policy decision not to undertake an evaluation. The Parliamentary Health Select Committee reported ambivalently on the independent centres in 2006.127 Waiting lists were falling, but the contribution the centres were making was doubtful. The gloss was going off the idea. Seven of the second wave were cancelled and others were delayed. Some contracts were cancelled, and compensation paid.

Hospital building and the Private Finance Initiative (PFI)

Until 1991, all major capital expenditure in the NHS had been funded by central government. The NHS did not have to pay interest or repay capital, so in effect new equipment and buildings came ‘free’. But for 20 years there had been a squeeze on capital expenditure and the hospital building stock was in a poor state. The PFI was always a trade-off – while the money was more expensive than central funds, those funds were unavailable and, through PFI, hospitals could be built that would otherwise never have existed. A building wave began.

From 1992, PFI became “the only game in town”. It became increasingly costly and cumbersome. Labour nevertheless developed this policy, later called public–private partnership, for it kept building off the public balance sheet. Major schemes were typically ‘Design-Build-Finance-Operate’ (DBFO) – a private consortium designed the facilities to NHS requirements, built them, financed the capital cost and operated the facilities. The NHS paid an annual fee to cover the capital cost, maintenance of the hospital and any non-clinical services provided over the 25–35-year life of the contract, after which the hospital would be handed over in a good state. As contracts could include staffing and clinical services, unions opposed schemes in which private companies could set their own terms and conditions of service.

The number of schemes was overwhelming. Government argued that PFI would result in better hospital designs, the private partner taking on the risk of construction cost and time overruns, and more efficient maintenance. (Guy’s Hospital Phase 3 had risen in cost by over 300 per cent and was three years late.) Critics, such as Allyson Pollock, believed it was locking the NHS into expensive 30-year contracts. PFI more than doubled the cost of capital as a percentage of Trusts’ annual operating income.128 A select committee of the House of Commons reported in 2002 that PFI was being blamed for ills not directly related to it, whereas the many benefits ascribed to it had yet to be proved; it recommended that more capital be found from central sources for major schemes so that PFI projects could be compared with conventionally procured ones.

An ‘outline business case’ was prepared, with a detailed statement of content and of who should bear the financial risks. There was usually a reduction of beds, in the hope that care would be transferred to the community, and to cut costs. Between May 1997 and March 2002, 64 major PFI hospital developments were approved with a total capital value of more than £7.5 billion; 11 were completed and operational by early 2002, including Carlisle, Dartford & Gravesham, South Buckinghamshire, Greenwich, North Durham, Calderdale, South Manchester, Norfolk & Norwich and Hereford, and schemes at Worcester and Barnet & Chase Farm. For a few schemes, such as Sheffield, government pledged public funds. By 2006, 24 PFI schemes were complete and operational, with a total capital spend of £2.1 billion. Another 14 schemes were approved, to a value of £3 billion. In preparation were other schemes worth another £12.1 billion, at various stages of negotiation, including cancer and cardiology at Barts and The London and the University Hospital Birmingham schemes. With an annual spend of some £3 billion, this was, in real terms, the largest building programme that the NHS had seen.

The downside was that the cost could be 20–30 per cent more than money lent by the exchequer, and closure of PFI units would incur high compensation payments. Future hospital planning was hog-tied, as the cost of altering a PFI hospital was surprisingly high. The savings were sometimes less than had been calculated. PFI appeared to be less open to outside scrutiny, sometimes led to developments smaller than clinically required, and created a substantial future revenue burden. Relatively modest expansions hung round the necks of Trusts. Some were so costly to run that they risked permanent deficit, and savings could be made only by cutting services at older hospitals that were cheaper to run. Decisions on closures were likely to be made on economic grounds, rather than on patient need and care.

In South East London, the PFI schemes at the Queen Elizabeth Hospital Trust in Woolwich, Bromley and Lewisham imposed an immense burden; the QE was in effect bankrupt because of the annual payments to be made. A commissioner-led reconfiguration programme, A Picture of Health, led to the merger of the three organisations to create South London Healthcare NHS Trust and was expected to solve the problem. Its planning took six years and failed. The King’s Fund examined the lessons learned.129 In 2006, the government reviewed schemes to ensure that they were financially sustainable.

Rating hospital Trusts

A series of attempts was made to assess and publicise the quality of Trust services. The first system, in 2000, gave zero, one, two or three stars. Indicators such as inpatient and outpatient waiting times, cleanliness and financial results were pulled together in a formula that relied on the data available rather than the effectiveness of clinical care. Rating systems threw up odd results. Prestigious hospitals were found to be unsatisfactory, and many of those failing had major problems to face: rebuilding, hospital mergers or the implementation of major new information systems. Star ratings were dependent on data the Trusts had supplied, some of it fallacious. The Audit Commission reviewed progress towards the NHS Plan in June 2003, and found little correlation between star ratings and management, financial stability or clinical outcome.130 Similarly, published results from Dr Foster showed that standardised hospital mortality had no correlation with the number of stars awarded to hospital Trusts.131 The star system was dropped in favour of systems later developed by the Healthcare Commission.

The effects of the NHS Plan on hospitals

Many of the proposals in the NHS Plan were designed to improve hospital care: more hospital building, more doctors and more nurses. Staffing was a constraint on development of the NHS. Workforce Development Confederations, commissioning training programmes, were established in 2001. Coterminous with SHAs, they were integrated with them in 2003. A strategy for HR, ‘More staff working differently’, was issued in July 2002.132 There should be improved hospital cleanliness. Hospital food, long a subject of criticism, should be better. A report by the Nuffield Trust in 1999 had shown that much food was wasted because it was served at rigid times, was unattractive, and patients who needed help with feeding often did not receive it.133 A team of professional chefs was recruited and produced a book of Chef’s Recipes. The cost of the recipes being high, hospitals delayed their introduction. Some patients preferred their traditional cottage pie to celebrity cuisine. By 2003/04, two-thirds of outpatient appointments and elective admissions would be pre-booked, and no longer subject to waiting lists.

The Modernisation Agency, established in the wake of the NHS Plan, worked hard (and expensively) to improve matters. On its closure, it distilled its recommendations for the improvement of hospital systems into ten ‘high-impact’ changes:134

  1. Treat day surgery as the norm for elective surgery
  2. Improve access to key diagnostic tests
  3. Manage variation in patient discharge
  4. Manage variation in patient admission
  5. Avoid unnecessary follow-ups
  6. Increase reliability of performing interventions: a care bundle
  7. Apply a systematic approach to care for people with long-term conditions
  8. Improve patient access by reducing number of queues
  9. Optimise patient flow using process templates
  10. Redesign and extend roles.

Cross-border health care

A ruling by the European Court of Justice in 2001 that medical care in hospital was subject to European law on the free movement of services, and that prior authorisation was an obstacle to the free movement of patients, raised important issues. The Court ruled that patients had the right to seek treatment abroad if they faced undue delay. The legal background was strengthened when, in 2003, a High Court judge, Mr Justice Munby, ruled that “if treatment in this country under the NHS is unduly delayed, then an NHS patient is entitled as a matter of European law to travel to another member state, there to be treated on terms requiring the NHS to reimburse the cost of that treatment”. Alan Milburn agreed to allow overseas treatment if, after clinical assessment, the patient wanted it and the PCT could meet the cost from its budget. The Department of Health co-ordinated pilot areas for authorities wishing to buy packages of operations from continental hospitals. In January 2002, the first group of nine patients left for Lille, and accommodation a world away from crowded NHS wards.

The European Commission published proposals for how citizens of the European Union should obtain health care in other member states in 2008. The commission’s proposals had three main strands: first, values (universality, access to good-quality care, equity, and solidarity) and principles (quality, safety, care based on evidence and ethics, patient involvement, redress, and privacy and confidentiality); second, aspects of cross-border care not already covered by existing legislation, such as that covering people who fall ill while temporarily abroad – key elements relate to people who choose to go abroad to obtain care; and third, mechanisms to foster European collaboration on health services, such as shared facilities in border areas, common methods of technology assessment, and centres of excellence for rare conditions. 135

Services in London – Turnberg

In 1997 Frank Dobson commissioned Sir Leslie Turnberg and a panel to undertake a strategic review of health services in the capital.136 In his report, Turnberg said that London could no longer be considered over-bedded compared with the rest of England. London had made radical efforts to reduce the number of ‘surplus’ beds, but now elective surgery and emergency admissions were jeopardised. The problems of inner-city primary care remained. The panel considered a number of capital developments within a five-sector plan already adopted by the University of London for its medical schools, approving the development proposed for the University College London hospitals.

A £422 million PFI to unite the University College London Hospitals on a single site came to fruition with its opening in 2005. In 2001, UCLH purchased the old National Heart Hospital, now converted into a state-of-the-art private heart unit. Turnberg modified the proposals for St Bartholomew’s Hospital and The Royal London, and planning for their redevelopment made progress. Attempts were made to improve primary health care. There had been high hopes for the London Initiative Zone, established in 1993 to introduce innovative approaches to problems and develop cost-effective care outside hospital. A review five years later showed a mixed picture. London still had fewer young GPs, more single-handed practices, and larger lists. Although primary care in the capital was improving, it was not doing so more rapidly than elsewhere in the country. The initiative was terminated.

Organisational change in London

There have been two main patterns for London’s health service planning: the ‘starfish’, a radial organisation reflecting the transport links; and the ‘doughnut’, with the élite hospitals, the cream, in the middle. Bevan had chosen the ‘starfish’ and London had been divided between four radial regional hospital boards. The arguments for such a pattern were now weaker. A London region had been proposed in the Tomlinson Report (1992). In June 1998, the Secretary of State, Frank Dobson, decided that, from April 1999, London would become a single NHS region (a ‘doughnut’). This would give greater cohesion and coterminosity with government departments and local agencies, including local authorities. Change would have ripple effects on the surrounding areas. A new boundary at the western side of Bedfordshire would separate an Eastern region from a large ‘L’-shaped South-eastern region. Covering 7 million people, the London region had 100,000 employees and a budget of £7 billion. However, the University’s decision to organise its medical schools in five radial sectors, and the decision to abolish first regions and then departmental regional offices, returned the starfish to favour. Five radial SHAs were established within London in 2002 – but, in 2006, they were amalgamated into a single NHS London SHA. Changes in organisational structure were now making centralised planning difficult, and the establishment of Foundation Hospital Trusts increased the power of some hospitals to determine their own future.

A Framework for Action (Darzi on London, 2007)

Ten years after Turnberg, David Nicholson, then Chief Executive of the London SHA, commissioned Professor Sir Ara Darzi to review the health services in London. His first report, in March 2007, The Case for Change, outlined reasons for change, for example, health inequalities, failure to meet patients’ expectations and value for money.137 The follow-up, A Framework for Action,138 was published in July 2007, just after the author had been appointed by Gordon Brown as a junior health minister and ennobled so he could act for Labour in the House of Lords.

The report was the product of many hands and had national influence. Technical groups had looked at the nature of the London population, the trends and likely health problems in London in the future. Clinical groups considered the care pathways best suited to differing groups of patients. A changed pattern of health service organisation was proposed. London primary care would be provided by 150 polyclinics, handling much care previously undertaken in hospitals. The number of major acute hospitals would be cut by more than a half, and there might be some 12 specialist hospitals and eight to 16 major acute hospitals. Trauma, accidents and emergency surgery would be reorganised. Supporting papers dealt with the clinical problems of other specialties, including maternity, long-term, mental illness and terminal care. Patients with emergencies would be admitted to the hospital best suited to their needs.

Brilliant in conception, clinical in slant, but a recipe for turbulence according to The Guardian, it was a blueprint for a radically different NHS, not unlike the 1920 Dawson report. Some conclusions flowed directly from the evidence. Others were less well founded: for example, there was no evidence that polyclinics were the answer for every locality, and they might reduce ease of access even if they were affordable. Neither was it clear how far this top-down concept fitted with patient choice, the freedom of Foundation Trusts, practice-based commissioning or payment by results. Darzi had no knowledge of general practice. There was a naive belief that polyclinics could be created, and money saved, by transferring perhaps half of hospital care into them. Other reports followed.

Medical education and staffing

Medical staffing

The sixth decade saw revolutionary changes in medical education, staffing and the roles of doctors, with the establishment of new medical schools and the expansion of established ones. There was a major increase in the number of medical students, exponential growth in medical knowledge with reshaping of the undergraduate syllabus, and attempts to improve postgraduate education with a new pattern – Modernising Medical Careers (MMC).

New contracts with hospital consultants and general practitioners aimed to improve productivity and quality, but major errors in negotiation led to massively increased bills, and sometimes to a poorer service.

Medical schools and medical education

During the decade, the number of students starting studies rose by two-thirds. Doctors numbered some 100,000 out of more than a million workers in the NHS; government was responsible for the costs of medical education and was the main employer of doctors. The 1968 Royal Commission on Medical Education (Todd)139 recommended a doubling of medical school intake to 4,230 by 1980, numbers not achieved until 1992. Planning was often based on pessimistic assumptions about growth in health expenditure rather than changes in the size of the population or the services needed. Treasury insistence on limiting expenditure and professional concerns about medical unemployment went virtually unchallenged. Small increases were suggested from 1991 onwards, but then policy altered.

In view of the increasing workload and the working life of doctors (30 years for a man and 22 for a woman), the third report of the Medical Workforce Standing Advisory Committee in 1997 proposed that the UK should aim for self-sufficiency. Alan Maynard, the health economist, was scathing about self-sufficiency, seeing “nothing wrong with employing people trained in excess by misguided foreign states”. Medical school places would be expanded by 40 per cent.140 The capacity and quality of the NHS were crucially dependent on professional staff, and it was clear that there were too few.141 A major expansion of training facilities was set in hand. The annual student intake would be increased by 1,129 above the 1996 figure of 4,820. When Labour announced additional money for the NHS in July 1998, some was for the expansion of medical (and nursing) education. This enlarged intake coincided with a reduction in resources for medical schools from university sources, particularly as money from research grants fell. The schools felt that they were being expected to train more students at the same time as they had to make medical school staff redundant.

The NHS Plan (2000) promised a further increase of 1,000 in medical school places, to nearly 7,500. The Joint Implementation Group (joint between the Higher Education Funding Council for England (HEFCE) and the Department of Health) allocated new places, and universities scrambled to secure new student places and the money that went with them. In England there had been no new schools for 30 years, but Alan Milburn announced new schemes in 2000 and 2001. The pattern of medical education was changing radically, and the Wanless Review (2002)142 agreed that substantial increases in the demand for health care might lead to a substantial shortage of doctors.

The welcome but abrupt rise in numbers took place between 1997/98 and 2006/07, with a 71 per cent increase in medical school places. Intake to English medical schools increased from 3,749 in 1997/98 to 6,194 in 2006/07, partly because of new medical schools, which brought the total number of schools to 24. By 2008/09, the number of doctors graduating from medical schools had increased to 5,684.

Medical schools in London had found difficulty in providing adequate clinical experience; similar problems now appeared elsewhere. Reduction in bed numbers, lack of clinical academics, and changing patterns of work that reduced the time middle-grade doctors had for student teaching compounded the problem. Medical schools found it increasingly hard to choose from the many applicants with good A level results. Most adopted an additional UK Clinical Aptitude Test (UKCAT), and the Graduate Australian Medical School Admissions Test (GAMSAT) was used by several UK graduate-entry medical schools.

New medical (joint) schools: Peninsula Medical School (2002); Hull York (2003); Brighton and Sussex (2003)

New stand-alone school: East Anglia (2002)

New schools associated with existing schools: Newcastle with Durham (2002); Warwick (independent 2007); Keele (independent 2007); Nottingham/Derby

The new medical schools took full advantage of the GMC reforms: integration of basic and clinical sciences, use of community settings for teaching, early contact with patients and a wider basis for student selection. They encouraged joint training of medical, nursing and other health professions. Some, such as Keele, began as clinical sites for an existing school (Manchester) but applied to become schools in their own right; Keele admitted its first students in 2007. Universities previously without a medical school lacked the experience of older established institutions and might not understand the close linkages of service and education.

Traditionally more students from social class 1 were accepted into medicine than the combined totals of classes 3, 4 and 5. Many thought that tomorrow’s doctors should reflect the diversity of the community and wanted to broaden the social and ethnic range and to attract graduates of other disciplines. Some attempted to attract bright students from low-achieving schools and developed an extended medical degree programme (King’s College). Extra academic support was needed, but such students had a high rate of success. Shorter four-year graduate entry courses were introduced on the recommendation of the Medical Workforce Standing Advisory Committee (MWSAC) for mature students.

The CMO summarised progress in a document, Medical Schools – Delivering the Doctors of the Future,143 and a study in 2007 by the University of Birmingham showed success in increasing the number of mature entrants and lower numbers from social classes 1 and 2.

Changes to the curriculum

The guidance from the GMC in 1993 (Tomorrow’s Doctors)144 modified the curriculum. The vast increase in medical knowledge meant that swathes had to be moved into the period of postgraduate training. Emphasis moved from gaining knowledge to a learning process that included the ability to evaluate data as well as to develop skills to interact with patients and colleagues. Medical students in some schools might now qualify without having delivered a baby or repaired a tear. The stress on factual knowledge decreased and more was placed on self-learning, communication skills and sociological understanding. The guidance was revised in 2003 and again in 2009 and continued to stress ‘touchy-feely’ qualities as well as the need for a knowledge base. The curriculum tried to integrate scientific knowledge and clinical practice from the earliest weeks and to encourage students to be problem-solvers. Education became ‘outcome focused’ and ‘topic’ based. Problem-based learning and teaching across professional disciplines was introduced, particularly in the new schools. Students were increasingly learning outside the walls of the teaching hospital. The foundation in biomedical sciences now included cell biology, molecular biology and genetics. Older professionals – and some patients – were worried lest excellent interpersonal communications mask ignorance of basics, such as the ability to name the main bones or know the anatomical positions of nerves. In 2002 the new Peninsula medical school removed anatomical dissection of the body completely from its medical course; anatomy was now studied in different ways including sophisticated imaging. 

The demography of the student population was changing. White males were under-represented, while there was a substantial proportion of men and women from an Asian background. A discussion paper by the BMA on Equality and Diversity in UK Medical Schools145 showed that, in 2003, 21 per cent of entrants were over 21, compared with 9 per cent in 1996; 61 per cent of entrants to UK medical schools were female (29 per cent in 1963); and 59 per cent came from the highest social classes. This report was updated in 2009.

The NHS and the medical profession had to adapt to the increasing number of women in the profession – just over a third. President of the RCP, Professor Carol Black, thought the high proportion might have substantial long-term effects.146 Medicine, previously dominated by white males, was being feminised. Would the profession maintain in the future the same status and influence, in view of the difficulty women had in doing all the things formerly seen as part of professional life – research, teaching, medical politics, societies, committees, regulatory bodies and government advice? 

Medical staffing

Developments in medicine, such as interventional radiology, altered the skills mix required by the NHS, though lengthy training reduced the ability of doctors to change specialty. Immigration eased the problems of NHS staffing; in the last decades of the century, some 40 per cent of doctors entering the NHS came from overseas. However, the decision to expand medical student intake to achieve self-sufficiency was confounded by the inability to give priority to home-trained graduates over skilled people from overseas who had a right to enter the UK job market. The medical marketplace was international: how did one take account of doctors trained in Europe, British citizens who went to overseas medical schools and then returned, and well-qualified doctors from elsewhere in the world?

The Calman scheme, which affected the later years of training, was followed in 2004 by the new MMC initiative.147 MMC, established in 2003, was a set of changes aimed at addressing long-standing problems with the UK medical education system and the wider medical workforce. In 2004/05 the Postgraduate Medical Education and Training Board was established to develop a single framework for postgraduate education and training, taking over the responsibilities of the Specialist Training Authority of the medical Royal Colleges and the Joint Committee on Postgraduate Training for General Practice. MMC, like Calman, had its origin within the Department of Health, where medical training traditionally met the needs of the hospitals and the NHS, education sometimes being incidental rather than central. It aimed to change the senior house officer (SHO) grade, which contained half of all doctors in training and combined a high workload with poorly structured training opportunities. In 2005 a Foundation Programme curriculum was introduced to provide structured two-year training with exposure to a broad spectrum of specialties including accident & emergency, obstetrics & gynaecology, and anaesthetics. Each trainee would experience primary care and perhaps the smaller specialties and academic medicine, not normally available at this stage of training. The second year would effectively replace the SHO grade, and require high-quality training with progress dependent on competence rather than time in post. As in the case of Calman, education now seemed to be placed ahead of service needs. The changes to a tried and trusted system produced problems for young doctors, who rotated through multiple posts and made a smaller contribution to running the service. Trusts relied increasingly on staff grades and consultants.


After the foundation years, doctors would opt for either a general practice registrar or a specialist registrar post with formal training programmes. There were two types of specialist registrar training. In type I, the assumption was that training would lead to a consultant post, with annual in-training assessments; satisfactory performance led to the award of a Certificate of Completion of Specialist Training (CCST) and entry to the Specialist Register held by the GMC. Type II specialist registrars had fixed-term training appointments, with programmes designed to meet the needs of the individual doctors, but these did not lead to a CCST. By 2006 the new scheme was in operation but it ran into difficulties and, in 2007, matters were worse. Young doctors from several year-groups were competing for a single year’s posts, and the computerised application system failed to select appropriately qualified junior doctors for training posts. Doctors in SHO posts had to compete with those from the growing UK medical school output and with doctors who had come from overseas, so there were far too many applicants – 28,000 for 15,500 training places. The system could not cope with the volume of applications, limited the number of applications doctors could make, made judgements on doubtful criteria ignoring past experience, and failed to produce adequate shortlists for interviews. Many faced the blighting of their careers, and possible emigration. The medical profession was united in protest along with hundreds of candidates. Professor Alan Crockard, the National Director of MMC, resigned. The Department of Health announced an urgent independent review, chaired by Sir John Tooke, Dean of the Peninsula Medical School, which issued an interim report in October 2007.148 MMC had been a sorry episode from which nobody had emerged with credit. The changes had been rushed, poorly led and badly implemented. MMC did not provide doctors with enough broad experience because it encouraged them to specialise early in their careers, and did not allow enough flexibility to meet NHS needs. The report called for fundamental reforms, for the system “was unlikely to encourage or reward striving for excellence”. A modified and more local process for the coming year was introduced. If UK medical graduates, trained at great cost, could not obtain specialist training because of a large number of applicants from outside Europe, then it was right to consider change. MMC was not flexible enough and, in future, those being trained must be able to move between specialties and into research and academia, or into a post that did not head towards being a consultant. A linear pathway with no opportunity to change helped nobody.

Tooke’s final report, published in 2008, argued for the separation of the first two years, allowing universities to guarantee a first medical post to their graduates. It also recommended that postgraduate education should be managed by a new body – NHS Medical Education England (NHS-MEE) – and taken out of the hands of the Department of Health after its mismanagement of funding and job applications. The Department of Health’s response in February 2008 glossed over the gravity of the problems there had been. Among the recommendations accepted was to bring the Postgraduate Medical Education and Training Board into the GMC by 2010, bringing all stages of medical education and training under one roof. In May 2008, the Commons Select Committee on Health was highly critical of the chaotic planning.149

Non-consultant grades

For Trusts, the first priority was to keep the service running within the restrictions on working hours for doctors in training and the European Working Time Directive.150 Trusts increasingly appointed doctors to hospital posts created for this purpose, despite ceilings on training grade numbers. The new posts were generally in acute specialties and did not conform to the standard NHS grades. Those roughly at SHO level were usually referred to as ‘Trust doctors’; those with three or more years of relevant experience held staff grade, hospital specialist or ‘associate specialist’ posts – the non-consultant career grades. The doctors appointed were often from outside the European Economic Area (EEA), had widely varying experience, and might not have specialist qualifications. They were employed to work, not study, and there was often no strategy for their education or supervision. By 2003, there were around 5,000 doctors on local contracts for whom the national grade was unclear. It was hard to move from such posts to consultant ones. Some remained in such jobs until retirement. Their growing numbers created an underclass of doctors.

European Working Time Directive

The European Working Time Directive substantially changed the way medical care was provided in hospital; to cover the work, hospitals needed more doctors than existed in the current training-post quota. Covering nights and weekends would be difficult as the Directive specified no more than 48 hours work per week, 11 hours continuous rest in 24 hours, 24 hours continuous rest in seven days, a 20-minute break in work periods of over six hours, and for night workers, an average of no more than eight hours work in 24 hours.

The effect of the Calman pattern of training and the European Working Time Directive was to reduce substantially the time between becoming an SHO and appointment as a consultant from some 30,000 hours to 8,000 or less. It seemed inevitable that the experience and skills of newly appointed consultants would be less than in the past, probably as generalists and almost certainly in advanced subspecialties. The NHS – and many European health care systems – had to undergo dramatic changes. The problems of providing a safe service were complex and hospitals struggled to conform. In the middle of the night, a patient needed somebody to take complex decisions, and with the competence to carry out specialised procedures – impossible without appropriate training and experience.

Recruitment

The rate of loss of doctors from the NHS remained stable. Surveys of those qualifying from 1977 onwards showed that some 80 per cent of both men and women were working in medicine in the NHS. The proportion working in general practice, however, had fallen. There was an accelerating trend for consultants to retire early, and greater demand for them in a consultant-delivered rather than a consultant-led service. There were higher numbers of women doctors in the hospital service (33 per cent in 1998), some of whom chose to take a career break. About half the women were working part time. Studies suggested that young doctors viewed commitment to the NHS differently. While prepared to fulfil a reasonable contract, they demanded time for self-fulfilment and family responsibilities in a way that previous generations of doctors had not.

There was a shortage of indigenous, UK-trained doctors, both in general practice and in hospital medicine, that could only be remedied in the long term by increasing student numbers, as a result of which medical schools were expanded. The NHS continued to rely extensively on doctors from overseas. The high proportion of women medical students was now feeding into the training grades and it seemed that women doctors remained very selective about the specialties they chose. Specialties with a major on-call commitment – for example, cardiology and gastroenterology – were substantially less popular with women than oncology or radiology.

New registration by country and area of qualification

Year    UK       EEA excluding UK    Non-EEA    Total
1989    3,568    1,315               3,110      7,993
1997    4,030    2,022               4,358      10,410
1998    4,241    1,730               3,923      9,896
1999    4,304    1,512               2,943      8,759
2000    4,457    1,380               2,833      8,703
2001    4,279    1,425               3,139      8,895
2002    4,404    1,617               5,224      11,245

The European Economic Area (EEA) comprises the countries of the European Union together with those that have agreements to participate in its single market.
Source: GMC 2001/02.

The consultant contract and private practice

For many years, the UK pay system for consultants was a fixed salary (with incremental points) and selective bonus payments (distinction awards) that were introduced early in the history of the NHS. Consultants with a full-time contract could undertake limited private practice, with remuneration no higher than 10 per cent of their NHS salary. Those with a part-time contract (including maximum part-time at 10/11 of a full-time salary) could undertake unlimited private practice.

By the end of the 1990s, both the doctors and the government thought that the contract, though modified over the years, was now inappropriate and that substantial revision was necessary. Private practice remained a bone of contention. The consultants’ negotiating body, the Central Consultants and Specialists Committee (CCSC), published proposals in October 2000 and the government published its own the following February; these were reminiscent of those of the 1974 Labour administration and reflected the NHS Plan. The government, supported by NHS managers, wished to tighten its grip over consultants and reduce their freedom, especially as far as private practice was concerned. There would be a substantial increase in the number of specialists – not a new idea, but welcome. Proposals to merge distinction awards and discretionary points onto the pay scale, and to make the system more open and ‘fair’, were also welcome. Clinicians were concerned about proposals to review consultants’ job plans and to introduce appraisal systems. Most controversial was a suggestion, introduced in the NHS Plan, that newly appointed consultants would work exclusively for the NHS for the first seven years of their career. Existing full-time consultants, although having the right to undertake limited private practice, would have to prove that they were fulfilling NHS requirements. Full-time commitment would bring significant financial rewards. However, the BMA opposed in principle the view that consultants should do nothing outside their NHS contract and saw the proposal as a vindictive attack of uncertain legality.

The doctors’ leaders spent two years negotiating a contract that would give consultants considerable increases in pay in return for agreeing to work more flexibly – possibly “unsocial hours”. In June 2002, the contract was put to the profession. Government abandoned attempts to place private practice off limits for younger consultants, but the consultants’ obligations to the NHS were set out more precisely. Young consultants would have to offer the NHS two additional sessions before they did any private practice, and consultants seven years into their appointment would give the NHS an extra session. To deter consultants from early retirement, there would be annual increments over 20 years. There would be greater managerial control over the consultants’ working week and an agreed job plan and work timetable.

The new contract offered more money in return for accepting greater managerial control and the potential to be obliged to work unsocial hours. The negotiators misinterpreted the mood of consultants, and both the junior doctors and the consultants rejected it by a margin of two to one. The BMA CCSC chair, Dr Peter Hawker, resigned. Three major areas of concern were: deeply unhappy relationships between NHS managers and doctors; fear that consultants would be subjected to unreasonable or unachievable demands because of pressures on NHS managers to meet government performance targets; and that, on top of emergency work, doctors would be forced into working unsocial hours on a routine and long-term basis.

New negotiators asked the government to renegotiate the contract, suggesting comparatively minor changes. In October 2003, consultants and specialist registrars voted by three to two to accept the deal. The contract was based on job planning, and it rapidly became obvious that consultants had frequently and genuinely been working more than their contracted hours. Now that the Department and NHS management wished everything to be spelt out, consultants did just that. The costs of the new contract proved far higher than had been predicted, sometimes throwing hospitals into financial deficit. Back-pay alone might amount to tens of thousands of pounds, and it was joked that the consultants’ car park resembled the most up-to-date car showroom. In April 2007, the National Audit Office confirmed what had become apparent: consultants’ pre-existing workload had been underestimated, and they now received some 25 per cent more money without apparently working any longer hours.

Nursing

Nursing education and staffing

Traditional nurse education had tended to be task-based, with an accent on the safe performance of routine duties, often by rote, and sometimes without explanation of the reasons why. In spite of some relaxation of discipline, the student nurse’s life was often still not his or her own. Hospital demands came first. In many ways, life had changed. Patient care had become immensely more complicated, with high-technology health care and a myriad of new preparations on the drugs trolley, many of which interacted with each other. The attitudes of both staff and patients had changed. Wards had a high throughput and, with increasingly powerful forms of treatment, hospitals could be dangerous places. Nurse education had to prepare staff for a vastly different world.

In 1988 the government accepted a new university-based system of nurse education – Project 2000 – a programme designed to meet the new demands. A step-change that was needed, it was firmly in educational hands. Trainees had student status but, unlike other university students, they did not pay fees. Education was increasingly centralised. While there had been some 40 sizeable hospitals in London that once would have had their own school of nursing, now nine universities educated students. The new schools were larger and many had facilities of a quality undreamt of in the earlier hospital schools. Entry to nursing had previously been controlled by nursing schools associated with a training hospital, and by regulations governing the state-based examinations, which included academic criteria. Now the UK Central Council for Nursing, Midwifery and Health Visiting (UKCC) and, from April 2002, the Nursing and Midwifery Council (NMC) continued to demand minimum academic criteria, but it was the universities (often the new ones – previously polytechnics) that selected students, and their admission criteria varied, being higher at some than others. In some schools, applicants were primarily white girls aged 18, while at others the average age might be in the late 30s, often people from ethnic minority groups who had already raised a family and had experience of life. Academic and examination results were still desired but ‘access courses’ were available. Selection might involve appraisal of personal attitudes, numeracy and literacy; almost half the applicants failed such tests. After a core course in their first year, students followed one of four pathways: adult nursing, children, mental health or learning disability. Those aiming for adult nursing formed the majority.

The number entering each college was determined by a workforce confederation coterminous with an SHA. Trusts played a major part in determining training intake, but their calculations did not take into account non-NHS demands for nurses – the rapidly expanding nursing home sector and NHS Direct. The size of the entry was progressively reduced during the 1990s, a time with no problems of retention. By the latter years of the 1990s, the vibrant economy offered many more possibilities to young people, and for the first time ever, there was a shortfall in candidates for nurse-training places. The King’s Fund thought that people were more critical of nurses and perhaps there were now fewer altruistic young women and more alternative careers were available. Parents might view nursing as a low-status occupation, poorly paid, and not the best choice for their children.

While service-based training had had untoward effects, the new ‘uncoupled’ system also had its problems. There were tensions between the universities and the NHS. Nursing academic staff found it hard to manage the multiple roles of teacher, researcher, administrator and clinician, for some worked in both a School and a Trust. Because nursing students were no longer members of staff but ‘supernumerary’, their clinical experience lessened. Responding to concerns about this, colleges of nursing began to increase the time spent in a clinical environment to 50 per cent. Newly qualified nurses were often ill-fitted to take responsibility, hardly surprising given the immensely more pressurised environment they were entering. Sometimes hospitals, disillusioned by the poor literacy and numeracy of newly qualified nurses, introduced their own recruitment tests to ensure the safety of patients, and many failed – more from some colleges than others. Mentoring schemes existed for the support of the new nurse and the safety of patients.

Frank Dobson, then Secretary of State, blamed the nursing shortage in part on Project 2000. He thought the emphasis on the academic had put off some potential recruits. A better balance between the practical and academic components of nurse training was needed, and students should have more contact with the NHS earlier in their courses. A Department of Health strategy, Making a Difference, had two messages: nurses and nursing were valued and should be more powerful, but it was critical of the academic drift.151 Perhaps nursing had taken the wrong path in its initial period in higher education?152 Nurses should have better working conditions, a more flexible career structure allowing breaks in training, and better-paid ‘consultant’ nursing posts for the most highly qualified. A new model of nurse training, with an accent on developing practical skills earlier, became the standard across England. The NHS, as well as educational interests, was increasingly involved in selection – students would go to the wards earlier in the course and for longer periods, and had a ‘home’ hospital to encourage them to feel more part of the NHS. The NMC replaced the UKCC in April 2002, more training places were created, and proposals for cadet schemes were encouraged. A Commission for Pre-Registration Education was established by the UKCC to explore recruitment and educational issues, and its report, Fitness for Practice (1999), found that, at registration, nurses lacked practical skills.153 It recommended a shorter preliminary theoretical programme. Early in the course there should be clinical placements, more experience of the 24-hours-per-day, seven-days-per-week nature of health care, and a period of at least three months’ supervised clinical practice towards the end of the course.

It had been hoped that basing education in the universities would reduce the drop-out rate, but a National Audit Office report in 2001 showed that 20 per cent of student nurses left during their course, and a further 20 per cent did not subsequently join the NHS. The reasons were complex. Many said they had financial problems, although four out of five students regularly had paid jobs. Sometimes students had a vision of what nursing was about, and it was abruptly shaken by their experiences on the wards. Modernising Nursing Careers,154 published in 2006 by the UK’s four health departments, identified four priorities: to develop a competent and flexible nursing workforce; update career pathways and career choices; prepare nurses to lead in a changed health care system; and modernise the image of nursing and nursing careers.

Subsequently the Department consulted on a new structure for nurses’ careers. Five broad pathways were suggested: Children, family and public health; First contact, access and urgent care; Long-term care; Acute and critical care; and Mental health and psychosocial care. The NMC reviewed pre-registration education, which later culminated in a proposal for an all-graduate nursing workforce.

Staffing

Nurse shortage was a global challenge. Demand continued to grow but, in many developed countries, the supply was falling. An ageing nursing workforce was caring for increasing numbers of elderly people. Low levels of trained nursing staff could lead to poor care, low morale and loss of staff. A landmark study of the effect of nurse/patient ratios in acute surgical units in Pennsylvania hospitals showed that the chance of a patient dying within 30 days of admission increased by 7 per cent for each additional patient, above four, for whom a registered nurse was responsible; nurse burnout also increased.155

The NHS in England employed over 300,000 whole-time equivalent registered nurses and midwives. Since 1997, the number leaving the professions outstripped the number of entrants. In 1997/98, 16,392 nurses and midwives joined the UKCC register and 27,173 left. The average age of nurses and midwives was rising. Nearly half of NHS nurses and midwives were aged over 40. Attempts to solve the situation included improving retention, broadening the field of recruitment (including mature entrants), attracting ‘returners’, and importing nurses from other countries – a few of which had an oversupply.

Registered nurses, England (whole-time equivalents, not including nursing assistants)

1998: 304,563
2000: 316,752
2002: 346,537
2004: 375,371
2005: 381,257
2006: 374,538
2007: 376,737

Source: Health and Social Care Information Centre.

Inner-city and teaching hospital Trusts were worst affected by staff turnover, which could easily be 25–35 per cent annually; elsewhere, recruitment of local students was easier and housing costs were lower. The NHS Plan (2000) recognised the shortage and promised 20,000 more nurses by 2005. To improve recruitment, the 1999 pay award for newly qualified staff was 12.5 per cent, far above the level of inflation. Pay awards continued to be comparatively generous to nurses, annually and through regrading exercises. Extra payments were made to nurses in London and the southeast, where problems were worst. Nurses took on roles previously performed by doctors; support workers picked up work previously undertaken by nurses; and the number of support workers (health care assistants, nursing auxiliaries and scientific support staff) grew – by 2006, there were nearly 290,000. Their training and regulation became significant issues: some had little training, others held National Vocational Qualifications, and some nursing assistants were in fact qualified nurses who preferred the more ‘hands-on’ role. Frequently they substituted for trained nurses, undertaking responsible and complex nursing work. In 2000, the RCN voted to admit trained health care assistants to its ranks. Accommodation was a problem and Labour appointed a ‘tsar’ to stimulate its provision. Nurses, making up 70 per cent of the workforce and costing up to 35 per cent of a Trust’s budget, were inevitably under cost scrutiny. Trusts might not have the money to recruit staff even if they were available, and Professor Alan Maynard questioned whether higher salaries would accomplish much when many of the problems of recruitment stemmed from nurses’ experience on under-staffed wards and the characteristics of many nursing jobs.

The costly use of agency nurses increased. The Audit Commission said that, in 1999–2000, the NHS in England spent over £790 million on temporary nursing staff, 20 per cent more than the previous year. They were an expensive but essential part of the workforce. With six weeks’ annual leave and rostering systems that meant nurses might work only 14 days per month, the opportunities to put in additional hours for an agency, increasing one’s income, were considerable.

Nurse movement between countries

A nursing qualification is a passport to travel. By 2001, some 5,000 British nurses annually were applying for jobs overseas, and the NHS remained substantially and increasingly dependent on nurses who had trained overseas. The Department of Health established a website to encourage recruitment from India, Spain and the Philippines. Requests from overseas nurses to register with the NMC rose rapidly. Most came from non-EU countries, particularly the Philippines, but also South Africa, Australia, New Zealand and the West Indies. The Philippines trained more nurses than it needed (a government policy, as remittance income was a key source of economic growth), had a rigorous US-style four-year degree course, and it proved possible to recruit substantial numbers of hard-working and responsible staff. Patients liked them. Overseas nurses might work as ancillaries while they adapted to British ways. Some were on temporary visas; others planned to stay permanently. Almost a third worked in the inner London area. Countries such as South Africa and Zimbabwe could ill afford the loss of professional staff trained at substantial cost. In 2001, over 2,000 nurses were recruited from sub-Saharan Africa, where in some countries 20–30 per cent of the population were HIV positive. UK hospitals soon found themselves treating their own nurses.

Percentage of new UK-trained and foreign-trained nurses added to the UKCC/NMC register, 1989/90 to 2001/02

Statistics showed a steady increase in numbers on the register, and of UK-trained nurses, to the highest number ever; rapidly rising numbers of overseas-trained nurses and midwives coming onto the register; that, for the first time, men represented more than one in ten of those on the register; the continued long-term trend of an ageing workforce; and increased mobility between countries.

Nursing practice

Hospital nursing had changed radically as patients were admitted and discharged more rapidly and treatment was of a complexity undreamt of by the founders of the NHS. Nurses were pursuing new pathways and nursing was far less homogeneous, with multiple clinical specialties; some nurses were managers, others were undertaking triage, cannulating and administering intravenous fluids, or diagnosing illness. Developments such as NHS Direct and walk-in centres were based on nurses rather than doctors as the first point of contact. There was increasing acceptance of the nurse-practitioner in England, 20 years behind the USA. Nurses were effective in many roles, for example consultations about predominantly minor illness in general practice, and supporting patients with long-term conditions. There was a major expansion of nurse prescribing: in the 1990s, community nurses could prescribe independently from a limited formulary and, from May 2006, independent nurse prescribers in England were able to prescribe any licensed medicine for any medical condition within their competence. Courses of several weeks for independent nurse prescribers were established.

Hospital clinical nurse specialists were on the increase. Some took over functions traditionally undertaken by hospital junior staff or family doctors, such as pre-admission clinics, minor injury services, emergency psychiatric assessment or the co-ordination of termination of pregnancy. The European Working Time Directive, which limited the time junior doctors could spend on duty, led many hospitals to look at the possibility of nurse-substitution. The nurses involved had often spent ten or more years in their profession, and many years in their particular specialty. Government introduced the idea of the ‘consultant’ nurse, few in number but paid on a significantly higher scale, breaking down the division between the role of the doctor and the nurse. In some fields, mental illness among them, nurses increasingly led services, admitted and discharged patients, and made autonomous clinical decisions, organising programmes of care. The largest group were Macmillan nurses, providing palliative care, followed by specialists in diabetes, asthma, stoma and wound care, infection control, and AIDS.

Too posh to wash

The rising profile of hospital-acquired infection was a challenge. Liquid soaps appeared outside the wards; patients were encouraged to complain if nurses did not wash their hands. Yet the quality of basic patient care was sometimes poor. A report by the Healthcare Commission, following two outbreaks of Clostridium difficile in which at least 90 patients died, showed dirty and disgusting conditions in wards and toilets, with patients being asked to use their bed rather than a bedpan. In many hospitals, patients were often too sick to eat, or to feed and wash themselves.156 The recruitment of staff from an aggressive society, where the love of one’s brother was not always evident, created a new dynamic in the wards, mitigated by nurses from gentler cultures overseas. Many nurses still delivered exemplary care but it was distressingly clear, as in Stafford, to the elderly or their relatives that basic levels of care were often not provided. Some nurses seemed to believe that the caring aspect of their role should be devolved to health care assistants so that registered nurses could concentrate on treatment and technical nursing. Beverly Malone, General Secretary of the RCN, told the College’s 2004 conference that “the argument that you are too posh to wash is ridiculous. A nurse who doesn’t want to provide basic care has missed what an important part this plays in nursing. When bathing a patient, nurses are also assessing them, checking their breathing and emotional wellbeing.”157

The conscientious nurse faced massive problems. Staff shortages, because recruitment was difficult even when the budget was there, lowered morale on the wards. This resulted in nurses leaving the profession for other, less stressful and better paid, jobs – a vicious circle. How could staffing levels that pushed staff beyond their limits of stamina and compassion be condoned? How could nurses, who had received what was planned as a rigorous and systematic education, be party to such poor quality of care? Two-thirds of hospital beds were now occupied by people over 65. The Standing Nursing and Midwifery Advisory Committee, reporting in 2001 on Caring for Older People, found major problems.158

Studies suggest that there are deficits in the core nursing skills required to meet the needs of older patients. Too many nurses see fundamental skills, such as bathing, helping patients to the toilet and assisting with feeding as tasks that can be delegated to junior or untrained staff. The emphasis on qualified staff being involved in patients’ activities of daily living may have shifted as other aspects of the nursing role, such as technical and managerial components have developed. But skilled nursing care cannot be delivered from a distance or through agents. It is a ‘hands-on’ activity …. The rapid expansion of specialties within nursing and the developing role of the allied health professions, e.g. physiotherapy, occupational therapy and dietetics, mean that several separate professional groups are now responsible for aspects of care, such as nutrition, that were previously nursing domains. There are also a large and growing number of nursing specialties, such as tissue viability, continence and infection control, whose areas of expertise overlap with traditional nursing practice. Increasing specialisation may have had the unintended detrimental effect of de-skilling adult nurses.

‘Care Pathways’ were developed as a way of systematising the treatment patients received, for example for breast cancer, building on long-standing nursing procedures.

Nursing uniforms remained a vexed issue. Few nurses now had ever seen the traditional uniform, except in films. Trouser suits, or clothing appropriate to an operating theatre (particularly in intensive care units), were becoming the norm. Academics wrote of the traditional wear as a badge of servitude, akin to the domestic dress of the nineteenth-century servant; the uniform was redolent of a class and power structure in society, and should therefore be opposed. Some hospital wards experimented with the total abolition of the nursing uniform. Nurses might like this, although it made it harder for patients to identify who was, and who was not, a nurse. Other service industries, placing an accent on ‘customer care’, had uniformed staff, but nurses in the NHS were moving in the opposite direction. Cross-infection appeared to be a serious problem; few hospitals now had laundries where uniforms were washed at high temperatures, reducing bacterial contamination. Nurses usually washed them at a lower temperature at home, and might wear them to and from work for several days. A survey at Southmead Hospital showed that more than a third of nurses’ uniforms were contaminated by significant organisms before going on duty.159

Nursing administration

After the introduction of the general management function in 1983, nurse influence on management lessened. Few hospitals now had a senior nurse who could be seen as a role model, a champion from whom leadership could be expected. The NHS Plan recognised a need for a new generation of managerial and clinical leaders, including ‘modern matrons’ with authority to get the basics right on the ward. The plan provided additional money to help get the wards cleaner. Ward housekeepers were introduced, with ward environment budgets under the control of sisters and charge nurses.

‘Modern matrons’ would be visible, with the authority to get things done, lead the nursing team in groups of wards, demonstrate to other nurses the high standards NHS patients should expect, make sure patients received quality care, and that cleaning and food standards were met. They would oversee the spending of ward budgets and resolve problems for patients. They might be concerned with infection control, and might have the power to order tests, admit and discharge patients, run clinics, triage patients and, where appropriate, prescribe medicines. Christine Hancock, as RCN General Secretary, said, “Patients have been crying out for someone they know to be in charge on hospital wards.” The number of ‘nurse consultants’ or ‘modern matrons’ grew to over 1,000 in 2007.

The condition of the NHS

For ten years Labour had set the agenda and, paradoxically, the private sector became involved in NHS provision as never before. The problem of long waiting times for treatment was largely solved and public satisfaction with the NHS increased, but there was now a perception that access to the family doctor service was poor and that hospital infection was out of control. In basic research, nanotechnology, stem cell research and genetic medicine were forging ahead, supplemented by rapid advances in pharmaceuticals and imaging technology. Patient expectations continued to rise. Each expensive new technology raised costs and increased demand as treatment became more effective and less traumatic. Government faced the dilemma of reconciling national standards, demanded by parliamentary accountability, with the need to decentralise decision-making. Spending on the health service had risen dramatically and the pay of doctors, nurses and managers was much higher. Hospital staffing had never been so substantial, patient turnover was more rapid, waiting lists had improved greatly, treatments were ever more sophisticated, family doctors had given up nights on call, and nurses were carving out new careers. Compared with the situation they had inherited, the NHS did appear “to have been saved”, but problems remained. Financial crises were building up. The cover provided out of hours had deteriorated as a result of GPs opting out. Health inequalities persisted nationally and internationally. Labour’s popularity was falling and there was increased interest in the health policies of the Conservatives.

References
1.

Moore W. The impossible dream. Health Service Journal 2000: 6 January: 8–9.

2.

Dixon J. [Editorial], BMJ 2008,336; 844-5

3.

Data Remember: Improving the quality of patient-based information in the NHS. Audit Commission. London. 2002.

4.

Parliament Information for health: an information strategy for the modern NHS 1998-2005. DoH: London; 1998.

5.

DoH. Delivering 21st century IT support for the NHS: national strategic programme. DoH. London. 2002.

6.

Benson T, BMJ 2002: 325,1066-9 &1090-93

7.

DoH. The New NHS - Modern, Dependable, Cm 3807, London, 1997

8.

The NHS Plan: a plan for investment, a plan for reform. Cm 4818-I, London, 2000.

9.

Shifting the balance of power within the NHS: Securing delivery. DoH. London. 2001.

10.

Delivering the NHS Plan: next steps on investment, next steps on reform. Cm 5503. DoH. London. 2002.

11.

Creating a patient-led NHS: Delivering the NHS Improvement Plan. DoH London. 2005.

12.

Parliament. Our Health, Our Care, Our Say - Community Care, a new direction for community services. Cm 6737. London. HMSO. 2006.

13.

Parliament. High Quality Care For All. NHS Next Stage Review Final Report. CM 7432 London. 2008.

14.

Benson T, BMJ 2002: 325,1066–9 &1090–93

15.

Enthoven Alain. In pursuit of an improving national health service. 1999 Rock Carling Fellowship. Nuffield Trust. London. 1999.

16.

DoH. The New NHS - Modern, Dependable, Cm 3807, London, 1997

17.

Dixon J. This is as good as it gets. [Editorial] BMJ 2000;321:315

18.

The report of the public inquiry into children’s heart surgery at the Bristol Royal Infirmary 1984–1995: Learning from Bristol. (Chair Professor Ian Kennedy) Cm 5207. London. The Stationery Office. 2001

19.

Health Service Journal 15 November 2001

20.

Shifting the balance of power within the NHS: Securing delivery. DoH. London. 2001.

21.

Degeling P et al, Medicine, management, and modernisation: a “dance macabre”? BMJ 2003; 326: 649-52

22.

Health Service Journal supplement 9 September 2004

23.

Department of Health. The NHS Improvement Plan: Putting people at the heart of public services. Cm 6268. Stationery Office. London. 2004.

24.

Mooney H & McLellan A, HSJ 2003, 9 October, 12-3 Department of Health. Building on the Best; Choice, Responsiveness and Equity in the NHS. London. 2003.

25.

Building on the best: Choice, responsiveness and equity in the NHS - response document. DH: London; 2003

26.

Greer SL, Roland D. Devolving policy, diverging values? Nuffield Trust. London. 2007

27.

Klein R. BMJ 2007;335:2-3 (7 July)

28.

A guide to NHS foundation trusts. DOH; London: 2002.

29.

Parliament. Our Health, Our Care, Our Say - Community Care, a new direction for community services. Cm 6737. London. HMSO. 2006.

30.

DoH. Creating a patient-led NHS: Delivering the NHS Improvement Plan. DH/NHS. London. 2005

31.

DH Health reform in England: Update and next steps Cm 6268 DH; Leeds: 2005

32.

Audit Commission/Healthcare Commission. Is the treatment working? Progress with the NHS system reform programme. London. 2008

33.

Healthcare for London. A Framework for Action. NHS London. London. 2007

34.

Darzi A. Our NHS our future: NHS next stage review - interim report. London: Department of Health; 2007.

35.

DH. Our NHS, Our Future. NHS Next Stage Review - Leading Local Change DH; London: 2008.

36.

DH. Best Research for Best Health - Introducing a new national health research strategy. DH; London: 2006

37.

Nuffield Trust. An independent NHS: a review of the options. Report by Professor Brian Edwards. London: Nuffield Trust, 2007

38.

Timmins N, Farewell to dodging and weaving. BMJ 2007;334:877

39.

Breakfast with Frost. BBC1 16th January 2000.

40.

Derek Wanless. Securing our future health: taking a long-term view - the Wanless Report. HM Treasury. London. 2002.

41.

No cash to implement NICE, health authorities tell MPs. BMJ 2002: 324; 258

42.

Nicholson, D. The Wisdom of the Crowd. Nuffield Trust, 2013.

43.

Derek Wanless, Securing good health for the whole population: Final report - February 2004. London: 2004; HM Treasury

44.

DH. Reforming NHS financial flows: payment by results. DH. London. 2002.

45.

HSJ 2005, 14 July, p5

46.

We Can Do Better — Improving the Health of the American People, Steven A. Schroeder, NEJM 2007, 357; 1221-1228

47.

Independent Inquiry into Inequalities in Health Report, Chairman: Sir Donald Acheson. London 1998. Stationery Office

48.

DH. The health of the nation: a strategy for health in England. Cm 1986. London: HMSO, 1992

49.

Department of Health. Saving Lives: Our Healthier Nation. Cm 4386. London: 1999; Stationery Office

50.

Tackling Health Inequalities – a programme for action. London: 2003; Department of Health

51.

Derek Wanless, Securing good health for the whole population: Final report - February 2004. London: 2004; HM Treasury

52.

Department of Health. Choosing Health – Making healthy choices easier. Cm 6374 London: 2004; HMSO

53.

Schuster MA et al, Milbank Quarterly 1998; 76: 517-63

54.

Institute of Medicine. Kohn LT, ed. Corrigan JM, ed. Donaldson MS, ed. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999

55.

Committee on Quality of Health Care in America. Crossing the Quality Chasm. Washington, DC: National Academy Press; 2001

56.

Williams SC, Schmaltz S, Quality of Care in U.S. Hospitals as Reflected by Standardized Measures, 2002–2004 NEJM 2005 353: 255-264.

57.

Leape and Berwick D. Five Years After To Err Is Human: What Have We Learned? JAMA.2005: 293: 2384-2390

58.

Quality of medical care delivered to Medicare beneficiaries: A profile at state and national levels. Jencks SF, Cuerdon T, Burwen DR, Fleming B, Houck PM, Kussmaul AE, Nilasena DS, Ordin DL, Arday DR. JAMA. 2000 Oct 4;284(13):1670-6

59.

Audit Commission. A Spoonful of Sugar – medicines management in NHS hospitals. London. 2001

60.

Department of Health. An organisation with a memory. Report of an expert group on learning from adverse events in the NHS. London. 2000. Stationery Office.

61.

Leatherman S & Sutherland K. The quest for quality in the NHS: a mid-term evaluation of the ten-year quality agenda. London, Nuffield Trust, 2003

62.

Aiken LH, Clarke SP, Sloane DM, Sochalski J, Silber JH. Hospital nurse staffing and patient mortality, nurse burnout, and job dissatisfaction. JAMA. 2002;288:1987-1993

63.

Jarman B et al. Explaining differences in English hospital death rates using routinely collected data. BMJ 1999; 318: 1515

64.

The report of the public inquiry into children’s heart surgery at the Bristol Royal Infirmary 1984-1995: learning from Bristol (Chairman Professor Ian Kennedy) London. 2001

65.

One Bristol, but there could have been many [editorial] BMJ 2001;323:179

66.

Parliament. The Shipman Inquiry, fifth report. Safeguarding Patients: Lessons from the Past – Proposals for the Future. (Chairman Dame Janet Smith DBE) Cm 6394-I. London. 2004.

67.

Walshe K, Benson L. Time for radical reform. BMJ 2005;330: 1504-6

68.

Scally G and Donaldson L. Clinical governance and the drive for quality improvement BMJ 1998; 317: 61-5

69.

Department of Health. A First Class Service: Quality in the New NHS. 1998. London: Department of Health.

70.

Department of Health. Building A Safer NHS For Patients ; promoting patient safety following An Organisation with a Memory. 2001. Department of Health. London.

71.

Parliament. The New NHS – Modern, Dependable. Cm 3807 Department of Health. 1997, London. Stationery Office.

72.

Bosanquet N, Health Service Journal, 19 June 2003, p 30–1

73.

Drazen J M, New England Journal of Med. 17 March 2005

74.

Department of Health. Getting Ahead of the Curve: A Strategy for Combating Infectious Diseases. London: Department of Health. 2002.

75.

Standing Medical Advisory Committee Sub-group on Antimicrobial Resistance. The Path of Least Resistance. Department of Health, 1998.

76.

Johns Hopkins Medical Institutions. The Antibiotic Guide (ABX Guide), at www.hopkins-abxguide.org,

77.

Adler M. Sexual health. BMJ. 2003; 362: 62-63

78.

Parliament. The BSE Inquiry Report. London: 2000. The Phillips report on BSE and vCJD. Lancet, Volume 356, Issue 9241, 1535.

79.

Wakefield A J et al. Ileal-lymphoid-nodular hyperplasia. Lancet 1998, 351. 637-41. This paper was later retracted

80.

Health Service Journal 25 May 2006

81.

Department of Health. Our inheritance, our future: realising the potential of genetics in the NHS. White Paper. 2003. CM 5791-I. Stationery Office.

82.

Wald NJ, Law MR. A strategy to reduce cardiovascular disease by more than 80%. BMJ. 2003 Jun 28;326(7404):1419.

83.

The potential impact of an opt out system for organ donation in the UK: an independent report from the Organ Donation Taskforce. DH. London. 2008.

84.

Fox A J and Rowbotham D J. Recent Advances in Anaesthesia BMJ 1999;319:557

85.

Transforming Emergency Care in England. A report by Professor Sir George Alberti. DH. London. 2004. And Emergency access - Clinical case for change: Report by Sir George Alberti, DH. London. 2006.

86.

National Stroke Strategy. DH. London. 2007.

87.

The NHS Cancer plan: a plan for investment, a plan for reform DH. London. 2000

88.

Commission for Health Improvement. NHS Cancer Care in England and Wales. CHI. 2001

89.

Department of Health & Welsh Office. A Framework for Commissioning Cancer Services. 1995.

90.

Department of Health. Cancer Reform Strategy. 2007.

91.

Why Mothers Die 2000–2002. RCOG. London. 2004.

92.

Lewis G (ed). The Confidential Enquiry into Maternal and Child Health (CEMACH). Saving Mothers’ Lives 2003–2005. The Seventh Report on Confidential Enquiries into Maternal Deaths in the United Kingdom. London: CEMACH; 2007.

93.

Department of Health. Maternity Matters. – Choice, access and continuity of care. London. 2007

94.

Risks and benefits of estrogen plus progestin in healthy postmenopausal women: JAMA. 2002;288:321-333

95.

Beral V and Million Women Study Collaborators. Breast cancer and hormone-replacement therapy in the Million Women Study. Lancet 2003; 362: 419-27

96.

BMJ 2003;327:767.

97.

Ministry of Health (1959) The Welfare of Children in Hospital, Platt Report. London: Her Majesty’s Stationery Office. Fit for the future: report of the Committee on Child Health Services (chairman, S.D.M. Court). 1976. London: HMSO

98.

National service framework: children, young people and maternity services. 2004: London; HMSO.

99.

Health Advisory Service. Not because they are Old. An independent inquiry into the care of older people on acute wards in general hospitals. 2000. London: HAS.

100.

Standing Nursing and Midwifery Advisory Committee. Caring for older people – a nursing priority. London: Department of Health; 2001.

101.

National service framework: older people. DH. London. 2001.

102.

Parliament. With Respect to Old Age: Long Term Care - Rights and Responsibilities. A Report by The Royal Commission on Long Term Care. (chair Professor Sir Stewart Sutherland) Cm 4192-I. Stationery Office. London. 1999.

103.

Department of Health. National service framework: mental health 1999. London HMSO

104.

Muijen M, HSJ 2003, 9 Oct, 18-19.

105.

Department of Health. Modernising mental health services safe, sound and supportive. 1998.

106.

Department of Health. Reforming The Mental Health Act 2001. Cm 5016-I. London. Stationery Office.

107.

Department of Health. Building on the best: Choice, responsiveness and equity in the NHS (response to consultation). CM 6079. 2003; London. Department of Health

108.

Salisbury C et al. What is the role of walk-in centres in the NHS? BMJ 2002; 324: 399-402; Grant C, Nicholas, R, Moore L, Salisbury C, An observational study comparing quality of care in walk-in centres with general practice and NHS Direct using standardised patients. BMJ 2002;324:1556.

109.

Parliament. Pharmacy in England, Building on strengths – delivering the future. 2008. London. Department of Health.

110.

Parliament. The NHS Plan: a plan for investment, a plan for reform, Cm 4818-I 2000 ; London: Department of Health

111.

Nocon A & Leese B, BJGP 2004, 54, 50-56.

112.

Department of Health. Our health, our care, our say: a new direction for community services. 2006: DH; London

113.

Royal College of General Practitioners. The future direction of general practice, a road map. 2007; RCGP :London

114.

Department of Health. NHS Next Stage Review. Our vision for primary and community care. 2008: DH. London.

115.

NHS London The Case for Change – a framework for action. 2007 London

116.

Imison C et al. Under one roof. Will polyclinics deliver integrated care? 2008: King’s Fund; London

117.

The Times 23 February 2008

118.

Pollock A et al. BMJ 2007;335:475-477.

119.

National Audit Office. NHS Pay Modernisation: New Contracts for General Practice Services in England. 2008. London

120.

Ray Robinson, HSJ, 3 July 2003, 18-19

121.

The National Beds Inquiry. DH. London. 2000

122.

NHS Confederation Report: Why We Need Fewer Hospital Beds? London. 2006.

123.

NHS. Hospital Plan for England and Wales. Cmnd 1604 London. HMSO. 1962.

124.

DHSS & Welsh Office. Central Health Services Council;. The functions of the District General Hospital. (Chair Sir Desmond Bonham-Carter. London. HMSO. 1969.

125.

Joint Working Party of the British Medical Association, The Royal College of Surgeons, The Royal College of Physicians. Provision of general hospital services. London: BMA, RCS, RCP, 1998

126.

Acute Health Services. Report of a working party. Academy of Medical Royal Colleges London. 2007.

127.

House of Commons Health Committee. Independent Sector Treatment Centres 2005–06. London. Stationery Office. 2006.

128.

Pollock Allyson M, Shaoul Jean, Vickers Neil. Private finance and “value for money” in NHS hospitals. BMJ 2002;324:1205-1209 Pollock AM, Price D. The private finance initiative: the gift that goes on taking. BMJ 2010; 341:c7175

129.

Palmer. Reconfiguring Hospital Services, Lessons from South East London, London: The King’s Fund 2011.

130.

Audit Commission. Achieving the NHS plan. Audit Commission. London 2003

131.

The Times, May 12, 2003, p 4

132.

DoH. HR in the NHS Plan: more staff working differently. London. 2002.

133.

Davis A, Bristow A. A recipe for quality. Nuffield Trust. London. 1999.

134.

Health Service Journal supplement 9 September 2004

135.

McKee M, Belcher P. BMJ 2008;337:a610

136.

Turnberg L. London Strategic Review. Independent Advisory Panel. Letter to the Secretary of State. 1997.

137.

Healthcare for London. The case for change. NHS London. London. 2007.

138.

Darzi A. Healthcare for London: a framework for action London: Healthcare for London, 2007.

139.

Parliament. Royal Commission on the National Health Service (Chairman Sir Alec Merrison). Cmnd 7615. 1979: HMSO: London.

140.

Medical Workforce Standing Advisory Committee. Planning the medical workforce. Third report, London: Department of Health, 1997.

141.

Smee C, Speaking Truth to Power, Nuffield Trust 2005: London

142.

HM Treasury. Derek Wanless. ‘Securing Our Future Health: Taking A Long-Term View’ HMSO; 2002: London.

143.

DoH. Medical schools: delivering the doctors of the future. DoH Publications. London. 2004.

144.

General Medical Council. Tomorrow’s Doctors. GMC; London; 1993

145.

BMA Education committee. The demography of medical schools – a discussion paper. BMA: London; 2004.

146.

Independent 2 August 2004

147.

House of Commons Health Committee. Modernising Medical Careers. Third Report of Session 2007–08. Parliament. London. 2008

148.

Tooke J. Aspiring to excellence: findings and recommendations of the independent inquiry into Modernising Medical Careers. London: MMC Inquiry, 2007

149.

House of Commons Health Committee. Modernising Medical Careers .Third Report of Session 2007–08. Parliament. London. 2008.

150.

BMJ 2003;327:961-964

151.

Department of Health. Making a Difference: strengthening the nursing, midwifery and health visiting contribution to health and healthcare. 1999: London

152.

Davies C, From Conception to Birth, Nuffield Trust 2002: London

153.

United Kingdom Central Council for Nursing, Midwifery and Health Visiting. Fitness for Practice. 1999: UKCC; London.

154.

DoH. Modernising Nursing Careers – Setting the Direction. 2006: Department of Health. London.

155.

Aiken LH et al. Hospital nurse staffing and patient mortality. JAMA. 2002;288:1987-1993

156.

Healthcare Commission. Investigation into outbreaks of Clostridium difficile at Maidstone and Tunbridge Wells NHS Trust. HCA: 2007; London .

157.

The Guardian, Tuesday 11 May 2004

158.

Department of Health, Standing Nursing and Midwifery Advisory Committee. Caring for older people - a nursing priority: Integrating knowledge, practice and values. 2001. London.

159.

Perry C; Marshall R, Jones E. Journal of Hospital Infection, Volume 48, Issue 3, July 2001, 238-241