Chronology: the first decade
1948
Background
Railways and electricity nationalised
State of Israel proclaimed
Marshall Plan
Transistor invented
Berlin airlift
Empire Windrush arrives
New town designation starts
NHS events
5 July 1948 NHS established
Development of specialist services: RHB(48)1
Medical Research Council (MRC) Social Medicine Research Unit (Central Middlesex)
1949
Background
Pound devalued to $2.80
NHS events
Powers to introduce prescription charges
Aureomycin, chloromycetin, streptomycin/PAS
Antihistamines
Cortisone and ACTH
Vitamin B12
Nurses Act creates regional nurse training committees
1950
Background
Korean war (1950–53)
NHS events
Link between smoking and lung cancer
Ceiling on NHS expenditure imposed
Bradbeer Committee appointed on internal administration of hospitals
Collings Report on general practice
1951
Background
Election: Conservative victory
Festival of Britain
NHS events
John Bowlby’s Maternal care and mental health
Helicopters used in casualty evacuation in Korea
1952
Background
Death of King George VI
Harrow rail disaster
Barcode invented
NHS events
Danckwerts award for GPs
Chlorpromazine
London fog – thousands of deaths from pollution
College of General Practitioners formed
Confidential Enquiry into Maternal Deaths
1953
Background
Korean armistice
Elizabeth II crowned
Everest climbed
NHS events
Watson and Crick establish the double helical structure of DNA
Nuffield report on the work of nurses in hospital wards
Gibbon uses a heart-lung machine in heart surgery
Jerry Morris study of heart disease related to activity
1954
Background
Food rationing ends
First business computer (IBM)
NHS events
Cohen Committee on general practice
First kidney transplant (identical twin)
Daily visiting of children in hospital encouraged
Bradbeer Report
Beaver Committee on air pollution reports
1955
Background
Credit squeeze
Election: Conservative victory
Independent Television launched
NHS events
Acton Society Trust papers on NHS
Ultrasound in obstetrics
Group practice loan funds
1956
Background
Suez crisis
Hungarian rising
Dr John Bodkin Adams arrested
NHS events
Polio immunisation
Clean Air Act
Guillebaud: Cost of the NHS
Large-scale trial of birth control pills
Working Party on health visiting (Jameson)
1957
Background
Macmillan Prime Minister
First satellites, Sputnik I and II
Treaty of Rome
NHS events
Willink report on future number of doctors
Royal Commission on doctors’ pay announced
Hospitals to complete the Hospital Inpatient Enquiry
Percy Commission (Royal Commission on Mental Illness) reports
On 5th July we start together, the new National Health Service. It has not had an altogether trouble-free gestation! There have been understandable anxieties, inevitable in so great and novel an undertaking. Nor will there be overnight any miraculous removal of our more serious shortages of nurses and others and of modern replanned buildings and equipment. But the sooner we start, the sooner we can try together to see to these things and to secure the improvements we all want . . . My job is to give you all the facilities, resources and help I can, and then to leave you alone as professional men and women to use your skill and judgement without hindrance. Let us try to develop that partnership from now on.
Message to the medical profession. Aneurin Bevan1
Preparing for the new service
For almost a century the government’s Chief Medical Officers (CMOs) had often begun their annual reports with an account of the year’s weather. It was a tradition going back to the Hippocratic view of its effect on health. Sir Wilson Jameson described the problems of 1947, the year before the NHS began.2
The eighth year of austerity, 1947, was a testing year. Its first three months formed a winter of exceptional severity, which had to be endured by a people who in addition to rationing of food were faced with an unprecedented scarcity of fuel. These three months of snow and bitter cold were followed by the heaviest floods for 53 years, which did great damage, killed thousands of sheep and lambs, delayed spring sowing and threatened the prospect of a good harvest which was so urgently needed. Immediately after these four months of disastrous weather there followed a period of economic crisis with an ever-increasing dollar crisis. So acute was the crisis that restrictions more rigorous than any in the war years became necessary. Bread had to be rationed for the first time late in 1946; in September 1947, the meat ration was reduced; in October the bacon ration was halved; and in November potatoes were rationed. A steep rise in the prices of foodstuffs and cattle food followed disappointing harvests in many European countries, due to the hard winter and hot dry summer, and in certain crops, notably corn for animal food, in America. Affairs abroad were as depressing as conditions at home.
The second world war had created a housing crisis. Alongside the post-war rebuilding of existing cities and the designation of overspill areas, the New Towns Act 1946 led to major new centres of population, most of them clustered in the south east. Their boundaries were drawn generously and land reclamation figured prominently; the planners covered thousands of acres of farmland, but they avoided tower blocks and the devastating results of the simultaneous redevelopment of the centres of older towns. The ethos and the pattern of the NHS had much in common with the newly nationalised state industries: railways, steel and the utilities. Beveridge, in his report of 1942, had proposed state funding but not how the NHS should work in practice.3 In 1944, before victory in Europe had been achieved, a committee within the Ministry had considered how the emergency hospital service would be ‘demobilised’. Bevan worked out the details: the NHS had a command structure and a ‘welfare state’ ideology, and was heavily dominated by those providing the services. On the appointed day, 1,143 voluntary hospitals with some 90,000 beds and 1,545 municipal hospitals with about 390,000 beds were taken over by the NHS in England and Wales. Of the ex-municipal beds, 190,000 were in mental illness and mental deficiency hospitals. In addition, 66,000 beds were still administered under Public Assistance, mainly occupied by elderly people who were often not sick in the sense of needing health care; among the residents were some with irrecoverable mental illness, a generous addition of ‘mental defectives’ and many old people who would now be regarded as having geriatric problems.
Designation of new towns
| Town | Year designated |
| --- | --- |
| Crawley | 1947 |
| Hemel Hempstead | 1947 |
| Harlow | 1947 |
| Newton Aycliffe and Peterlee | 1947 |
| Welwyn Garden City and Hatfield | 1948 |
| Basildon | 1949 |
| Bracknell | 1949 |
| Corby | 1950 |

Source: The Times, 11 October 1996
Just before the service started, Aneurin Bevan sent a message to the medical profession. He spoke of the profession’s worries about the discouragement of professional freedom and the worsening of a doctor’s material livelihood, and said that if there were problems they could easily be put right; he referred to a sense of real professional opportunity. The editor of the British Medical Journal (BMJ), in whose issue of 3 July 1948 the message appeared, was not so sanguine. While seeing the logic of spreading the high cost of illness over the whole community, the journal saw dangers in a state medical service: dogma, timidity, lack of incentive, administrative hypertrophy, stereotyped procedure and lack of intellectual freedom. However, much had been gained in negotiation over the previous months and the medical profession would now co-operate with the government. There was an opportunity to mould the service in partnership with the Ministry of Health. The service would have to evolve and there would be much trial and error, but the opportunity of building a healthy Britain would be grasped eagerly. “The pattern of events was clear,” said the BMJ; “the medical man, intensely individual, was becoming more and more aware of his responsibility to the community.”
Additional resources were negligible. The appointed day, 5 July 1948, brought not one extra doctor or nurse. What it did was change the way in which people could obtain and pay for care. They ceased to pay for medical attention when they needed it, and paid instead, as taxpayers, collectively. The NHS improved accessibility and distributed what there was more fairly. It made rational development possible, for the hierarchical system of command and control enabled the examination of issues such as equity.4 The Times pointed out that the masses had joined the middle classes. Doctors had become social servants in a much fuller sense. It was now difficult for them to stand aside from their patients’ social difficulties or to work in isolation from the social services.5 The Ministry, having worked for the establishment of the NHS, now became passive.
In making allocations to the regional hospital boards (RHBs) the Ministry of Health worked from what had been spent in the previous year. The boards took major decisions without fuss. Ahead of them lay the task of ‘regionalisation’, the development and integration of specialist practice into a coherent whole.6 Many reports were to hand, including the Hospital Surveys and the Goodenough Report on medical education.7 Bevan held a small dinner party on the first anniversary of the service to thank those who had been concerned with the preparatory stages; he toasted the NHS, coupling it with the name of Sir Wilson Jameson.
NHS managing bodies, 1948
- 14 regional hospital boards (RHBs)
- 36 boards of governors for teaching hospitals (BGs)
- 388 hospital management committees (HMCs)
- 138 executive councils (ECs)
- 147 local health authorities (LHAs)
There was uncertainty about who was in charge at regional level. In most regions there was a viable partnership with no single boss. The senior administrative medical officer (SAMO) was university educated, but this was not necessarily true of the secretary, who drew a lower salary. Regional organisation varied and could be complex. In April 1956, Sheffield RHB had seven standing committees, six standing subcommittees, some chairman’s and many other advisory committees, 23 committees of consultants and a nursing advisory committee. There were also nine special committees, five ad hoc building committees, liaison committees with teaching hospitals and the university, and joint committees with other authorities on matters such as the treatment of rheumatic disease. The East Anglian region was simplicity itself: its last remaining committee (finance) had ceased to meet and the board did everything! The subordinate hospital management committees (HMCs) ran the hospitals and sometimes started to rationalise their facilities, but they had little influence on wider issues. Power increasingly lay with the RHBs.
The Central Health Services Council
Standing advisory committees
The standing advisory committees remained in existence for over 50 years. There were four, each statutory and uni-professional: the Standing Medical Advisory Committee (SMAC) and its equivalents for nursing and midwifery (SNMAC), pharmaceutical services (SPAC) and dentistry (SDAC). They advised ministers in England and Wales when requested but also ‘as they saw fit’. Members were appointed by the Minister from nominations by the professions and included the presidents of the Royal Colleges. Their precise role changed over the years; initially they prepared guidelines on general clinical problems, usually through subcommittees.
The Central Health Services Council (CHSC), constituted by the 1946 NHS Act, was the normal advisory mechanism for the Ministry of Health. It had a substantial professional component alongside members representative of local government and hospital management.8 It was large and, after the first few years, met only quarterly, although several of its subcommittees remained influential. The Lancet believed that the Ministry never encouraged the CHSC to be a creative force. In its first 18 months, a host of novel and difficult problems faced the service and Bevan remitted 30 questions to it. He received advice from the Council on these and 12 other topics. At its first meeting, a committee was established to examine hospital administration, chaired for most of its existence by Alderman Bradbeer from Birmingham. Other issues included the pressure on hospitals and emergency admissions, the care of the elderly chronic sick, the mental health service, wasteful prescribing in general practice, and co-operation between the three parts of the NHS. Ten standing committees were established, some exclusively professional, and others to examine specific services such as child health, and cancer and radiotherapy.9 Over the first 20 years of the NHS, they produced a series of major reports that altered clinical practice, for example, on cross-infection in hospitals, the welfare of children in hospital, and human relations in obstetrics.

The main committees were SMAC and SNMAC. Henry Cohen chaired SMAC for the first 15 years of the NHS. A general physician from Liverpool, he had the intellectual gifts to remain a generalist at a time when specialisation was becoming the order of the day.10 To begin with there was anxiety in the Ministry that SMAC would prove an embarrassment in its demands, but soon the members had exhausted the issues about which they felt strongly. George Godber found it best to provide SMAC with background briefing on an emerging problem and only then to ask for its advice. The Ministry could not give doctors clinical advice, but SMAC could and did – for example, that when drugs were in the experimental stage, or scarce, they should be restricted to use in clinical trials; later they should be available solely through designated centres; and only when they were proven and in unlimited supply should control be no more than that necessary in patients’ interests.
Professional and charitable organisations
The introduction of the NHS affected many organisations that had taken part in the debates preceding the NHS. The British Hospitals Association, which had represented the voluntary hospitals, ceased to have a role and was rapidly wound up. The British Medical Association (BMA) continued at the centre of serious medical politics. For historic reasons GPs had always been powerful within it; they were many and they provided much of its money. When in 1911 Lloyd George’s national health insurance gave working men a doctor, GPs had to become increasingly active. The GPs’ Insurance Acts Committee was continued after 1948 as the General Medical Services Committee (GMSC), a standing committee of the BMA with full powers to deal with all matters affecting NHS GPs. The local medical committees elected it, as panel committees had done previously. It was not until 1948 that consultants had to enter the medico-political arena, which was new and unfamiliar to them. The consultants formed the Central Consultants’ and Specialists’ Committee, with powers analogous to the GPs’ committee as far as terms of service were concerned. The Joint Consultants Committee (JCC) succeeded the earlier negotiating committee, federating the BMA and the medical Royal Colleges, and represented hospital doctors and dentists in discussions with the health departments on policy matters other than terms of service. This complex system did not make for unity of the medical profession, particularly on financial matters.
The three Royal Colleges maintained powerful positions as a source of expert opinion and also in political matters. Charles Moran, Winston Churchill’s personal physician (known familiarly as ‘Corkscrew Charlie’), was President of the Royal College of Physicians (RCP) from 1941 to 1950. Alfred Webb-Johnson led the Royal College of Surgeons (RCS), and the two men’s relationship was a little prickly. William Gilliatt, the Queen’s obstetrician, was President of the Royal College of Obstetricians and Gynaecologists (RCOG); as his college dated only from the twentieth century, it was regarded as the junior partner. The colleges were London dominated, and their presidents were usually southern; Robert Platt was the first provincial President of the RCP. The RCS had been damaged in the war, and there was a chance of acquiring a neighbouring site so that all three Royal Colleges could be rebuilt together. Webb-Johnson had a vision of a medical area in Lincoln’s Inn Fields, perhaps grandiose, but it could have created a broad-ranging academy of medicine and a chance to develop methods of reviewing clinical practice.11 Moran stopped it, fearing that the RCP would become subsidiary. The RCS continued to encourage its own sub-specialties to develop and form close links with the parent organisation.
King Edward VII’s Hospital Fund for London (King’s Fund) had previously provided about 10 per cent of the income of London voluntary hospitals, but the state now funded these. It began to look at new fields, for example, the training of ward sisters, and catering.12 The Nuffield Provincial Hospitals Trust had fought for regionalisation, the pattern of organisation Bevan had adopted. It rapidly developed into a think tank on health service matters, but neither the Fund nor the Trust could maintain their direct influence on policy, although they were valuable sources of expertise.
More informal groups had existed before the establishment of the NHS. Wilson Jameson had his ‘gas-bag’ committee at the London School of Hygiene & Tropical Medicine, where he was Dean. The same institution spawned the Keppel Club, in which young doctors from many disciplines came together from 1953 to 1974.13 A small society with a tight membership, it was entirely apolitical and met monthly for freewheeling and uninhibited discussion; there was an opportunity to discuss new methods and systems at an intellectual level. Membership was by invitation, and included Brian Abel-Smith, John Brotherston, John Fry, Walter Holland, Jerry Morris, Michael Shepherd, Stephen Taylor, Richard Titmuss and Michael Warren. Over the years, even as its members grew busier and more senior, the club discussed such issues as child health, the care of the adolescent and the aged, general practice, hospital services, mental illness and the collection of information in the NHS.
The Royal College of Nursing (RCN), founded in 1916 as an association to unite trained nurses, emerged as a powerful body now that all nurses were working for the NHS. A decision was taken to discourage membership of mental illness nurses, who stayed with the Confederation of Health Service Employees (COHSE). COHSE hoped to become the industrial union for the NHS, but other unions recruited nurses (the RCN), ancillary workers (the National Union of Public Employees and the Transport and General Workers’ Union), administrative staff (the National Association of Local Government Officers), and laboratory and professional staff (the Association of Scientific Workers, later the Association of Scientific, Technical and Managerial Staffs).14
National medical charities generally acted as pressure groups and they continued their work, now with the NHS in their sights. Examples included the National Birthday Trust Fund, which campaigned for the extension and improvement of maternity services; the National Association for Mental Health (later Mind), promoting the interests of people with mental health problems; and the Association of Parents of Backward Children (later Mencap).
Medicine and the media
Newspaper and magazine articles on professional issues were uncommon. Medical authors were suspected of advertising, an offence for which they might be struck off the register. Doctors and nurses had mixed views about the media; some believed that there would be widespread hypochondriasis if it was no longer possible to keep people in ignorance of hospital care and their treatment. Television was slowly spreading from London throughout the country, but as late as 1957 only half of households had a set, and ownership among the professional classes was lower still. Educated people often talked about television without actually having seen it. ‘Emergency – Ward 10’, one of the earliest popular programmes, was thought to help nurse recruitment but was creating a modern mythology about nurses and hospital treatment.15 When BBC TV ran a programme on slimming and diet, the BMJ was alarmed by “this somewhat curious experiment that approached the public over the heads of the practising doctor”.16
Medical progress
Health promotion
Health education had been pursued during the years of war. The approach remained mass publicity on all fronts. Messages were didactic and concentrated on the dangers in the home, infectious disease, accident prevention and, in the 1950s, the diagnosis of cancer of the breast and cervix.17 There was little evidence that this technique, largely modelled on the advertising world, worked. Many doctors felt that the less patients knew about medicine the better, as Charles Fletcher, a physician at the Hammersmith Hospital, discovered to his cost when he advocated pamphlets for patients, explaining the causes of their illnesses and what to do about them.18 In 1951 the BMA launched a new popular magazine, Family Doctor. Primarily a health magazine, its aim was to present simple articles on how the body worked, the promotion of health, and the prevention of disease. The editor believed passionately that education and persuasion to adopt a different life style could improve the health of the nation. He felt that the time was past when medicine could be regarded as a mystery. Some subjects, however, were taboo, contraception being one of these.19
Jerry (Jeremy) Morris (1910–2009), a lifelong socialist and epidemiologist at the London School of Hygiene & Tropical Medicine, laid the foundations for the promotion of exercise as important to health. In a study with London Transport that lasted many years, he found that the drivers of London’s double-decker buses were more likely to die suddenly from “coronary thrombosis” than the conductors, and that government clerks suffered more often from rapidly fatal cardiac infarction than postmen.
Incidence of ischaemic heart disease in bus drivers and bus conductors
| Age (years) | Conductors: incidence per 100 men in 5 years | Drivers: incidence per 100 men in 5 years |
| --- | --- | --- |
| 40–49 | 1.6 | 7.6 |
| 50–59 | 5.1 | 9.8 |
| 60–69 | 7.4 | 7.9 |
| Total | 4.7 | 8.5 |
Source: Morris JN, Kagan A, Pattison DC and Gardner MJ. Incidence and prediction of ischaemic heart disease in London busmen. The Lancet 1966; 2(7463): 553–9.
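Read crudely from the totals in the table (an illustrative calculation, not a figure quoted in the paper itself), the drivers’ overall incidence was nearly double that of the conductors:

$$\text{rate ratio} = \frac{8.5 \text{ per 100 men in 5 years}}{4.7 \text{ per 100 men in 5 years}} \approx 1.8$$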
Bed rest
One of the most important clinical developments was simplicity itself. Richard Asher was a physician at the Central Middlesex Hospital who combined clarity of thought, deep understanding of the everyday problems of medicine and sparkling wit. It was he who gave Munchausen’s syndrome its name, after the famous baron who travelled widely and told tales that were both dramatic and untrue. In 1947 he was among the earliest to identify the dangers of institutionalisation and going to bed.20
It is always assumed that the first thing in any illness is to put the patient to bed. Hospital accommodation is always numbered in beds. Illness is measured by the length of time in bed. Doctors are assessed by their bedside manner. Bed is not ordered like a pill or a purge, but is assumed as the basis for all treatment. Yet we should think twice before ordering our patients to bed and realise that beneath the comfort of the blanket there lurks a host of formidable dangers.
Asher pointed to the risks of chest infection, deep vein thrombosis in the legs, bed sores, stiffening of muscles and joints, osteoporosis and, indeed, mental change and demoralisation. He ended with a parody of a well-known hymn:
Teach us to live that we may dread
Unnecessary time in bed.
Get people up and we may save
Our patients from an early grave.
The medical profession, although not immediately convinced, recognised that here was an issue to be explored. Francis Avery Jones, a gastroenterologist at Asher’s hospital, later said that early ambulation saved the health service tens of thousands of beds, and many people their health and lives. Doctors had previously equated close and careful post-operative supervision with keeping people in bed; once patients were out of bed there was a danger of premature discharge, and fatal pulmonary embolus might occur. For example, the BMJ said that a surgeon would be in a difficult position if he allowed a patient to be discharged on the fourth day after appendicectomy or the seventh day after cholecystectomy (as happened in the USA) and the patient then developed a fatal embolus in the second week.21 The probability that the embolus was the result of the closely supervised bed rest was not appreciated.
Surgeons were concerned that incisions would not heal if patients got up too soon, but Farquharson, at Edinburgh Royal Infirmary, wrote that the cause of morbidity and mortality after an operation was usually remote from the actual wound. He believed that there was little evidence that wounds needed bed rest to heal, and proved his point by operating on 485 patients with hernia under local anaesthetic and discharging them home before the anaesthetic had worn off. Only about one patient in 200 needed readmission. The patients liked early discharge, they waited only a few days for operation, and the financial savings were considerable.22
The quality and effectiveness of health care
Doctors seldom looked at their clinical practice and its results. When, around 1952, a paper that included lengths of stay was put to the JCC, one physician loftily said: “All that is needed is that a consultant should feel satisfied that he has done his best for the patient. This arithmetic is irrelevant.” Death was the clearest measure of outcome, and infant and maternal mortality were studied, but comparisons of the results of different types of treatment were rare. On occasion clinicians might seek Ministry support for medical review projects, but such support had to be covert and not an attempt to impose a central system. The use of randomised controlled trials now provided a way of validating clinical practice and the effectiveness of treatments. Matching cases by human judgement was open to error; randomisation involving large numbers provided an even dispersion of the personal characteristics likely to affect the outcome. The principles were established by D’Arcy Hart and Austin Bradford Hill. Bradford Hill crashed three aircraft without injury while serving in the first world war but subsequently developed tuberculosis, which barred him from clinical medicine. He read economics, obtained a grant from the Medical Research Council (MRC), moved to the London School of Hygiene and determined to make a life in preventive medicine. An inspiring writer, he saw many of his ideas pass into common usage; he understood the ethical and clinical problems that doctors faced, and could convince senior members of the profession that they should adopt controlled trials. A friend of Hugh Clegg, Editor of the BMJ from 1947, Hill chose that journal for his publications because of its wide circulation among doctors of all specialties. Clegg wanted good scientific papers and accepted long summaries because many doctors would not be prepared to read the papers in full.23 Hill fed Clegg the MRC’s report on the randomised trial of streptomycin in the treatment of tuberculosis, the trials of cortisone and aspirin in rheumatoid arthritis, and the trial of whooping cough vaccine. Though a powerful tool, the randomised trial was not universally applicable; in surgery, for example, randomisation was often impracticable.
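The principle Bradford Hill established can be sketched in a few lines of modern code (an illustration of the idea only: nothing of the kind existed in the 1940s, and the function and names here are invented for the example). With enough participants, random allocation tends to balance both known and unmeasured characteristics across the two arms.

```python
import random

def randomise(participants, seed=1948):
    """Allocate each participant at random to 'treatment' or 'control'.

    With large numbers, chance alone tends to balance characteristics
    (age, severity, and factors nobody has measured) across the arms -
    something matching by human judgement cannot guarantee.
    """
    rng = random.Random(seed)  # fixed seed only so the example is reproducible
    return {p: rng.choice(["treatment", "control"]) for p in participants}

# Example: ten anonymised trial entrants
print(randomise([f"patient-{i}" for i in range(1, 11)]))
```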
The MRC worked with the Ministry of Health and began to establish clinical research units. The provincial universities developed academic units more rapidly than London, under men such as Robert Platt, Professor of Medicine in Manchester, and Henry Cohen in Liverpool. The medical press and contacts between doctors had always helped the dissemination of new clinical ideas; now the NHS provided a new mechanism. It was said that those in the Ministry could achieve anything if they did not insist on claiming credit. Many doctors would take up a good idea when it was drawn to their attention, if the approach was tactful. SMAC could be asked to look at specific clinical problems. Regions could then be given guidance that would be adopted throughout the country if it was seen to accord with professional thinking. Once a new idea was spotted, it could be nurtured. Doors could be opened to let people through. Organisations such as the Nuffield Provincial Hospitals Trust, the King’s Fund and the Ministry worked quietly together. Some doctors were natural originators, others born developers, and both could be supported. Those seeing the way ahead would try to get others to follow. Postgraduate education, statistical methods, the use of controlled trials, group general practice and the development of geriatric and mental illness services were all ideas fostered and given a platform.
The drug treatment of disease
Before the second world war, many drugs had no effect, for good or ill. Placebo prescribing was commonplace, with a reliance on the patient’s faith. The first decade of the NHS saw the discovery of a staggering array of new and potent drugs. The drugs that were being developed were expensive and sometimes difficult to produce. Usually they were not immediately released for general use. The tetracyclines and cortisone were not available on GP prescription until 1954/5 when industrial-scale production facilities had been created. Inevitably costs rose. At the end of the 1949 parliamentary session, power was obtained to levy a prescription charge.24 It was not used immediately but was invoked by the next government and used almost continuously and increasingly thereafter.
Penicillin and streptomycin were available when the NHS began, but it was not known how they worked; biochemistry and cell biology had not developed sufficiently for the underlying mechanisms to be understood.25 Syphilis and congenital syphilis were among the diseases conquered. Within the next year, aureomycin, the first of the tetracyclines, was discovered and proved to be active against a far wider range of organisms. The response of chest infections to antibiotics rapidly revealed a group of non-bacterial pneumonias, previously unsuspected, caused by viruses and rickettsiae. Chloramphenicol was isolated from soil samples from Venezuela and soon synthesised; it worked in typhus and typhoid. In 1950 terramycin, another tetracycline, was isolated in the USA from cultures of Streptomyces rimosus. In 1956 a variant of penicillin, penicillin V, became available that could be given by mouth, avoiding the need for painful injections.26
The clinical exploitation of a new antibiotic usually passed through two phases: first, over-enthusiastic and indiscriminate use, followed by a more critical and restrained appraisal. Some strains of an otherwise susceptible organism were, or became, resistant to the drug. An early example was the reduction in efficacy of the sulphonamides in gonorrhoea, pneumonia and streptococcal infections. Penicillin withstood the test of time more successfully, but Staphylococcus aureus slowly escaped its influence and became resistant. Resistance of the tubercle bacillus to streptomycin was quickly acquired, and resistance was also a problem with the tetracyclines.27 Erythromycin was discovered in 1952, resembled penicillin in its action, and by general agreement was reserved for infections with penicillin-resistant bacteria.28 It became policy to use antibiotics carefully and to try to restrict their use.29
Cortisone, demonstrated in 1949 at the Mayo Clinic, did not fulfil all early expectations. It had a dramatic effect on patients with rheumatoid arthritis and acute rheumatic fever, but the effect was often temporary.30 Supplies were limited because the drug was extracted from ox bile and 40 head of cattle were required for a single day’s treatment. Adrenocorticotropic hormone (ACTH) was even more difficult to obtain, being concentrated from pig pituitaries. Quantities were therefore minute and costs were high, so more economic methods of production were sought. By 1956 prednisone and prednisolone, analogous and more potent drugs, had been synthesised and were in clinical use; like cortisone, they were found to be life-saving in severe asthma. Few effective forms of treatment had been available to dermatologists. Now there were two potent forms of treatment: antibiotics for skin infection, and corticosteroids that had a dramatic effect on several types of dermatitis.
The outcome for patients with high blood pressure was well known because there was no effective treatment. Four grades of severity were recognised, based on the changes in the heart, the kidneys and the blood vessels in the eyes. In severe cases, grades three and four, the five-year mortalities (death within five years of diagnosis) were 40 per cent and 95 per cent. Surgery (lumbar sympathectomy) might prolong survival, but in 1949 hexamethonium ‘ganglion-blocking’ drugs were introduced, and the era of effective treatment had begun. At first the drugs had to be given by injection, but preparations that could be taken by mouth were soon available. None of the alternatives approached the ideal: surgery was not particularly successful; dietary advice and salt restriction made life miserable; reserpine made patients depressed; and ganglion-blocking drugs had severe side effects, including constipation, fainting and impotence. Only people with the most severe hypertension were therefore considered for treatment.31
Vitamin B12 was isolated, and liver extract was no longer required in the treatment of pernicious anaemia.32 Insulin had been used in the treatment of diabetes since the 1920s, but a new group of drugs suitable for mild and stable cases, the oral hypoglycaemic sulphonamide derivatives, was developed. They simplified treatment, particularly in the elderly, and reduced the need for hospital attendance.33 The antihistamines were introduced, mainly for the treatment of allergic conditions. They were associated with drowsiness which, in drivers, caused traffic accidents. Reports from the USA that they cured colds were examined by the MRC; the drugs were valueless. The common cold had again come unscathed through a therapeutic attack.34
Chlorpromazine was introduced in 1952 for the treatment of psychiatric illness. It produced a remarkable state of inactivity or indifference in excited or agitated psychotics and was increasingly used by psychiatrists and GPs.35 The tranquillisers, for example, meprobamate, also represented a substantial advance. Barbiturates had been used for 50 years, but they were proving to be true drugs of addiction and were commonly used in suicides.36 The new drugs undoubtedly had a substantial impact on illnesses severe enough to need hospital admission, but whether they helped in the minor neuroses was less certain.37 William Sargant, a psychiatrist at St Thomas’, referred to the extensive advertising and the shoals of circulars through the doctor’s letterbox. Big business was beginning to realise the large profits to be made out of mental health. All that was necessary was to persuade doctors to prescribe for hundreds of thousands of patients each week.38
Halothane, a new anaesthetic agent, was carefully tested before its introduction, although repeated administration to a patient was later shown to be associated with jaundice.39 It was neither inflammable nor explosive, a genuine advance: explosions during ether anaesthesia, often set off by sparks from electrical equipment, occurred from time to time and inevitably killed some patients.
For many years there had been concern about adverse reactions to drugs and the best way to recognise them. As the pharmaceutical industry developed an ever-increasing number of new products, anxieties increased.40 The problem came to a head in the USA in 1951, when a few patients were reported in whom chloramphenicol had produced fatal bone marrow failure (aplasia). The American Medical Association appointed a study group to examine all cases of blood disorders suspected of being caused by drugs or other chemicals. The problem was thought to be rare, because chloramphenicol had been widely used, yet it was found that there had in fact been scores of cases of aplastic anaemia and it had taken three years to appreciate the potential toxicity. There was rapid agreement that its use should be limited to conditions untreatable by other means.41
In 1956, Dr John Bodkin Adams was arrested for murder following the deaths of many of his patients, often elderly ladies who had left him substantial sums in their wills; between 1946 and 1956, 160 of his patients had died in suspicious circumstances. He was acquitted, but later convicted of making false statements on cremation forms and of offences under the Dangerous Drugs Act. Opinion remains divided as to whether he was a mass murderer or an early proponent of euthanasia. He was restored to the Medical Register in 1961.
Radiology and diagnostic imaging
Tests and investigations were playing an increasing part in the diagnostic process. Radiology revealed the structural manifestations of disease, but the basic technology had not changed greatly since 1895 when the first films were taken. An X-ray beam produced a film for later examination, or the patient was ‘screened’ and the image was examined directly in a darkened room. The radiation exposure was higher with screening and the radiologist had to become dark-adapted before he could work. From the 1930s radiology developed rapidly, but hospital services were handicapped by a shortage of radiologists.
Three developments gave radiology a new impetus. First, in 1954 Marconi Instruments displayed an image intensifier, which produced a much brighter image, although the field was only five inches (12.7 cm) wide. The image was visible in subdued light and good enough to photograph, and the technique was immediately applied to studies of swallowing. Secondly, there were improvements in the contrast media used to visualise blood vessels; the early agents were often unpleasant and sometimes risky, and safer preparations were progressively introduced. Cardiac surgery was developing fast and catalysed developments in radiology; for example, angio-cardiography, in which contrast medium was injected into the blood vessels leading to the heart before a series of X-rays.42 The third development, in 1953, was the introduction of the Seldinger technique. This made possible percutaneous catheterisation, the introduction of a fine catheter into a blood vessel without the need for an incision: a guide wire was inserted and imaged and, once it was in position, a catheter was slid over it. Contrast medium could then be injected selectively into blood vessels, under direct vision using the image intensifier, just where it was required.43
The availability of radioactive isotopes (radio-isotopes) led to the development of nuclear medicine and a new method of imaging. Radio-isotopes could be introduced into the body, sometimes tagged to tissues such as blood cells. As they were chemically identical to the normal forms, they were handled by the body in the same way. It was possible to measure the presence and amount of the radio-isotope, its spatial distribution and its chemical transformation. The new techniques provided a way of studying, at least crudely, some of the body’s functions, as opposed to its structure. Isotopes were chosen to minimise the radiation dose as far as possible. At first, radioactive tracer work was the province of the pathologist, as in studies of blood volume and circulation. The development of gamma cameras and rectilinear scanners, however, meant that images could be produced as well as ‘counts’, and radiologists came to the fore.44
Early in 1955 the MRC, at the request of the Prime Minister, established a committee chaired by Sir Harold Himsworth to report on the medical aspects of nuclear radiation. Its report, a year later, contained the unexpected finding that diagnostic X-rays were a significant source of irradiation of the gonads, adding some 22 per cent to the dose received from natural background, whereas fall-out from the testing of nuclear weapons contributed less than 1 per cent.45 Shortly after, Dr Alice Stewart published a report suggesting that childhood leukaemia was associated with irradiation of the fetus (and also with virus infection and threatened abortion).46 Her findings were not accepted until a second study, from the USA, confirmed a connection with irradiation during pregnancy. Although radiologists were already concerned about the dangers of radiation exposure, there was some delay in taking greater precautions during pregnancy.
Infectious disease and immunisation
Deaths in England and Wales from infectious disease
| Year | Tuberculosis | Diphtheria | Whooping cough | Measles | Polio |
| --- | --- | --- | --- | --- | --- |
| 1943 | 25,649 | 1,371 | 1,114 | 773 | 80 |
| 1944 | 24,163 | 1,054 | 1,054 | 243 | 109 |
| 1945 | 23,955 | 722 | 689 | 729 | 139 |
| 1946 | 22,847 | 472 | 808 | 204 | 128 |
| 1947 | 23,550 | 244 | 905 | 644 | 707 |
| 1948 | 23,175 | 156 | 748 | 327 | 241 |
| 1949 | 19,797 | 84 | 527 | 307 | 657 |
| 1950 | 15,969 | 49 | 394 | 221 | 755 |
| 1951 | 13,806 | 33 | 456 | 317 | 217 |
| 1952 | 10,585 | 32 | 184 | 141 | 275 |
| 1953 | 9,002 | 23 | 243 | 245 | 320 |
| 1954 | 7,897 | 8 | 139 | 45 | 112 |
| 1955 | 6,492 | 12 | 87 | 174 | 241 |
| 1956 | 5,375 | 3 | 92 | 28 | 114 |
| 1957 | 4,784 | 4 | 87 | 94 | 226 |
| 1958 | 4,480 | 8 | 27 | 49 | 154 |
The decade saw the end of smallpox as a regular entry in public health statistics, the decline of diphtheria and enteric fever to around 100 cases per year, the greatest ever epidemic of poliomyelitis, and a substantial rise in food poisoning and dysentery, possibly related to better diagnosis now available through the Public Health Laboratory Service (PHLS). It is hard nowadays to appreciate the misery and deaths caused by infectious diseases, which were common and potentially lethal. In 1948 there were 3,575 cases of diphtheria with 156 deaths. Tuberculosis remained a major problem, although notifications to the Medical Officer of Health (MOH) and deaths were steadily getting fewer. There were 400,000 notifications of measles with 327 deaths, and 148,410 of whooping cough with 748 deaths. The USA had introduced diphtheria immunisation in the 1930s, but it was not until 1940/1 that local authorities, spurred by Wilson Jameson, launched a major campaign in the UK. A long-forgotten clause in a Public Health Act gave local authorities the power to do so. Whooping cough, tetanus and polio immunisation followed. As new vaccines were introduced, each was usually given three times; the schedule for infants became increasingly complex until ‘triple’ vaccines improved matters.
There had been small sporadic outbreaks of poliomyelitis for many years, but the disease assumed epidemic proportions in 1947. Thereafter the numbers fluctuated, but remained at a historically high level for several years with 250 to 750 deaths annually. It was the custom for cases to be admitted to isolation hospitals, and then transferred to orthopaedic hospitals for the convalescent and chronic stages. Oxford established a team including specialists in infectious disease, neurology and orthopaedics so that patients with severe paralysis could be assessed jointly from the start. Respiratory support with ‘iron lungs’ was available and passive movement of the limbs reduced the risks of later deformity. The tide turned when Jonas Salk developed an inactivated vaccine in the USA and reported the success of field trials in 1955.47 Manufacture began in Great Britain under the supervision of the MRC, and immunisation of children started in 1956.
Bacterial food poisoning was an increasing problem. Imported egg products from North and South America and, after the war, from China, sometimes contained Salmonella. Synthetic cream was associated with many outbreaks of paratyphoid fever, and spray-dried skim-milk was responsible for outbreaks of toxin-type food poisoning.
Cases of smallpox occurred intermittently. In 1950 there was an outbreak in Brighton, introduced by a fully vaccinated RAF officer recently returned from India. There were 26 cases and ten deaths; 13 of the cases were among nursing and medical staff, domestics and laundry workers at the hospital to which the earliest cases were admitted.48 In 1952, an outbreak in Rochdale led to 135 cases with one death, and there were further importations in succeeding years.
The death rate from tuberculosis had begun to decline after the first world war, but the incidence was still high, and primary infection occurred in nearly half of children before they were 14. When the NHS began, there were 50,000 notifications a year and 23,000 deaths. Before streptomycin, doctors relied on the natural resistance of the patient, aided by bed rest and the indirect effect of ‘collapse’ therapy. To reduce the movement of diseased lung tissue, in the hope that this would assist healing, sections of the rib cage were removed (thoracoplasty), air was introduced to collapse the lung (artificial pneumothorax) or the phrenic nerve was divided to paralyse the diaphragm. Antibiotics attacked the tubercle bacillus directly. There was insufficient streptomycin to treat everyone who might benefit, and supplies went to those in whom the best results could be expected: young adults with early disease. A rigorously controlled investigation run by D’Arcy Hart and the MRC confirmed the effectiveness of streptomycin. In a second trial, the newly discovered para-aminosalicylic acid (PAS) was shown to prevent the development of bacterial resistance, and a third trial examined the level of dose required.49 In 1952 isoniazid was introduced. Given alone it was no better than streptomycin and PAS, and patients could rapidly develop drug resistance. However, MRC trials and the work of Professor Sir John Crofton (1912–2009) in Edinburgh showed that it was not which drugs were given that mattered, but in what combination and for how long. As success could only be assessed by the absence of a relapse in subsequent years, it took time to establish the best options. Triple-drug therapy over 18 months to two years greatly reduced the emergence of resistant strains of tubercle bacilli, but some clinicians were slow to adopt the protocols that gave such excellent results. The results were so good that collapse therapy and surgical methods of treatment were used far less frequently.50 An MRC trial in India showed that, even under the worst social conditions, patients rapidly ceased to be infectious if they took their treatment; there was no need to admit patients for long periods to reduce the risk of infection to families and the community. For the first time, early treatment of tuberculosis had major benefits, yet there was an average delay of four months between the first consultation and a diagnostic X-ray; GPs were urged to refer patients more rapidly.51
In the drive for early treatment, disused infectious disease wards were used, a good example of the new opportunities open to the NHS. In 1948 the waiting list figures had convinced the Manchester Regional Hospital Board that a new sanatorium was urgently required. By 1953 it had not been built, but it was now no longer needed because the waiting time for admission had fallen from nine months to a few weeks.52 Within a few years, beds for tuberculosis and the fevers were being turned over to newly developing specialist units, for example, neurosurgery. After a successful trial of the tuberculosis vaccine BCG (bacillus Calmette-Guérin) by the MRC, immunisation at the age of 13 was introduced, reducing further the number of new infections. Mass mobile radiography units were important tools in ‘case-finding’. The vans would visit centres such as colleges and hospitals where there were many young people, and 35mm pictures were taken of images produced by fluorescent screening.
There was a major influenza outbreak in 1951/2. From 5 to 8 December 1952, ‘smog’ (fog filled with smoke) of unusual density and persistence covered the Greater London area. People piled coal onto their fires to keep warm. To most, smog was no more than an inconvenience; those with chronic heart and lung disease were less lucky. Their illnesses got worse and many died. Dying people, their lips blue from lack of oxygen, were forced to walk to hospital because ambulances stopped running. For some years an ‘emergency bed service’ had operated in London, finding beds for emergency admissions by phoning round the hospitals. It came under pressure and immediately restricted non-urgent admissions, but the media were first to spot the severity of the problem. Florists ran out of flowers for funerals. Newspaper articles drew attention to the death of prize cattle at the Smithfield show. Not until the death certificates had been assembled was the full severity of the episode apparent; there were 3,500–4,000 excess deaths.53 St George’s (Hyde Park Corner), like all London hospitals, admitted many victims of bronchitis and heart failure; as it was not possible to see from one end of a ward to the other, the wards were divided in two so that patients could be properly observed. A committee chaired by Sir Hugh Beaver was set up in July 1953 and rapidly identified the importance of pollution from solid fuels. Its recommendations (which smoke abatement groups had been suggesting for almost a century) formed the basis of a single comprehensive Clean Air Act, passed on 5 July 1956. Emission control was required; industry had to change and methods of manufacturing had to alter. It became an offence to emit dark smoke from a chimney, and local authorities could establish smoke control areas. Following the legislation, the age-specific death rates of men in Greater London fell by almost half. Opposition to the control of atmospheric pollution, for example from industry, was slight. This was not the case with smoking: although its hazards were far greater, there were issues of individual choice and liberty, and much more antagonism from industry.
Rheumatic fever, associated with streptococcal throat infection, was another common disease of childhood normally requiring admission to hospital. More frequent among the poor, there would be fever, pain and stiffness in the larger joints. Although some children might die of the acute illness (700 in 1949, falling to 174 in 1957), the main problem was that about half developed rheumatic disease of heart valves, which became incompetent (they leaked) or stenosed (they obstructed blood flow). The result was progressive heart failure in adolescence or later in adult life.
Milder infections were not ignored. At Salisbury, the Common Cold Research Unit had been established before the war to examine this difficult problem. Volunteers turned up every fortnight to help the scientific work. By 1950 they numbered more than 2,000, including 253 married couples, several being on their honeymoon.54
The incidence of venereal disease had increased in both world wars. After 1945 the level began to fall and many venereologists thought seriously of leaving what seemed to be a dying specialty. Venereal disease responded to antibiotics: syphilis was rapidly cured, and cases of congenital syphilis fell steadily as antenatal testing became routine, followed by treatment where necessary. The reduction in gonorrhoea, however, levelled off and drug-resistant strains became apparent. By 1955 the levels were rising again, and they continued to do so. Dr Charles, the CMO, said that sexual promiscuity was as rife as it had ever been in times of peace, and while this was the case, the venereal peril would be ever with us.55
The PHLS expanded as ‘associated laboratories’ were incorporated into the main network. Increasingly the laboratories were located on the sites of acute hospitals and came to provide bacteriological services to the hospital as well as to the local authorities responsible for the control of infectious disease; the PHLS was becoming involved both in the care of individuals and in the health of ‘the herd’. From the early days the PHLS wanted to recruit epidemiologists, but this was opposed by the Ministry and the medical officers of health (MOsH). From 1954 its weekly summary of laboratory reports contained hospital as well as community data, and became a comprehensive account of the prevalence of infection. The PHLS was also deeply involved in the study of hospital-acquired staphylococcal infection, for patients in surgical wards were increasingly infected by resistant strains; first detected in 1954, the problem spread rapidly and led to the appointment, in most hospitals, of infection-control nurses. The management of the service was reviewed in 1951, and the MRC was asked to continue to run it.
Orthopaedics and trauma
War has always produced medical innovations. The Korean War (1950–53) saw the introduction of helicopter evacuation, which in turn led to a reappraisal of the early treatment of injury. Within days of the Air Force choppers’ arrival, the US Eighth Army’s surgeon general asked for their help in evacuating critically wounded soldiers from the front. Thereafter, when they were not flying search-and-rescue missions, they pitched in to get the wounded to hospitals. In the first month alone, the Air Rescue choppers evacuated 83 critically wounded soldiers, half of whom, the Eighth Army surgeon general said, would have died without the airlift. The system was soon formalised and the infantry came to see that, if not killed outright, their chance of survival was now good. During their first 12 months of operation in 1951, Army helicopters carried 5,040 wounded. By mid-1953 Army choppers evacuated 1,273 casualties in a single month. “Costly, experimental and cranky, the helicopter could be justified only on the grounds that those it carried, almost to a man, would have died without it,” an Army historian concluded. It was many years before the lessons learned were applied to civilian trauma care.
Image: Barbara Hepworth painted a series of 60 images of surgeons and nurses, circa 1947.
The 1939–1945 war had given orthopaedic surgery impetus. During the latter part of the war, orthopaedic surgeons began to encounter, among prisoners of war repatriated from Germany, fractures treated by inserting a nail along the length of the marrow cavity. The method, originally described by Küntscher, was soon seen to be a success, making possible a shorter hospital stay.56 British surgeons, for example Sir Reginald Watson-Jones, were also developing and using internal fixation for fractures of the femoral neck. In 1949 Robert Danis of Brussels described a system of rigid internal fixation that allowed anatomically accurate reduction, compressing the fracture surfaces; this made it easier to get patients up and moving. Because of early rehabilitation, complications of treatment were reduced and there were far fewer bed sores and deaths from thrombosis and pulmonary embolism.57 At first the plates and screws used were copied from those familiar in joinery; later they were redesigned for the specific needs of fracture surgery. As understanding of fracture healing improved, there was growing recognition that stable fixation of a fracture had immense benefits in restoring the soft tissues for which the bone serves as a scaffold. In addition to the techniques of internal fixation, putting strong inert screws into the fragments of bone and holding them with a light but rigid external fixation system made it possible to correct major damage to soft tissue, vessels and nerves.
The other major pressure on orthopaedic departments was osteoarthritis. Osteoarthritis of the hip was a common and painful condition. Several operations had been devised that relieved pain at the cost of mobility, for example arthrodesis, which fused the femur to the pelvis. Among the more successful was Smith-Petersen’s procedure, involving the reshaping of the joint surfaces and the insertion of a smooth-surfaced cup of inert metal between the moving parts; re-operation was sometimes required. Arthroplasty, the total replacement of the joint by an artificial socket and femoral head made to fit each other, gave patients a new, mechanical joint. The procedure was first carried out by Kenneth McKee in Norwich around 1950, using cobalt-chrome components.58 No great attention was paid to the surface finish or fit, and the method of fixation proved inadequate; friction in the joint was high and there were both failures and successes. John Charnley, who saw some of McKee’s patients at a meeting of the British Orthopaedic Association, considered that the procedure might be improved, and the Manchester RHB funded him to develop a new unit near Wigan to refine it. The engineering problems were substantial and the results, to begin with, were not always predictable.
In 1952, 112 passengers were killed and 200 were seriously injured in a three-train collision at Harrow. There was chaos. By modern standards, the fire and ambulance services were hopelessly ill-equipped, and were untrained in keeping trapped people alive. All that could be done was to apply a little bandaging and take people to hospital as fast as possible. Edgware General Hospital learned of the crash when a commandeered furniture van arrived with walking wounded. Among those responding to the disaster were US teams from nearby bases, who were trained in battlefield medicine. They were disciplined, brought plasma and undertook triage – sorting casualties into those needing urgent attention, those who could wait, and those who were beyond help. It was a new experience for the rescue services; they were amazed and full of admiration.59 Yet the lessons were not learned for many years. In December 1957, another train crash occurred in thick fog near Lewisham. The ambulances moved 223 people, and 88 died in the accident. The senior administrative medical officer (SAMO), James Fairley, called for reports. As at Harrow, there were failures in communication, difficulty in identifying senior staff at the site, inadequate supplies of dressings and morphine, a shortage of ambulance transport, and difficulties in creating records and documenting the injured.60
Major trauma was also increasing on the roads as traffic became denser. By 1954 there were more than one million motorcycles on the road, and over 1,000 deaths among their riders. Crash helmets were seldom worn and the neurosurgical units picked up the problems.61 Roughly 50,000 people required admission for head injury annually, and head injury accounted for three-quarters of road deaths. The few neurosurgical units whose primary concern had been with tumours were increasingly asked to care for patients with head injury. More units were opened, improving accessibility. Walpole Lewin, in Oxford, argued for regional planning in close association with a major accident service.62 Research work at the Birmingham Accident Hospital improved the treatment of injury immeasurably. It was widely recognised that severe collapse after major injury was associated with a vast fall in blood volume, far greater than could be accounted for by external loss. Where had the blood gone, and what should the treatment be? Blood volume studies after accidents made it clear that huge amounts of blood were lost from the circulation into the swelling around fractures. Major burns led to a similar depletion of circulating blood volume. Rapid and large blood transfusion saved lives. Lecturing to the St John Ambulance Brigade, Ruscoe Clarke appealed for the re-writing of first-aid textbooks. The hot cup of tea and a delay while patients got over the shock of injury had to go; time was not on the patient’s side and recovery would only begin after transfusion and surgery.63 He provided the Association with new text for its handbooks.
Cardiology and cardiac surgery
In the 1940s the only methods available for the diagnosis of heart disease, other than bedside examination, were simple chest X-rays and the three-lead electrocardiograph. The effective drugs were morphia, digitalis and quinidine.64 The management of heart disease was about to change out of all recognition. It was a subject that attracted the cream of the profession; Paul Wood at the National Heart Hospital was only one among a number of clinicians who educated a new generation of doctors about valvular, ischaemic and congenital heart disease, taught new ways of listening to the heart and interpreting what was heard, and opened new pathways in treatment.65 Were he to have a heart attack, Wood did not wish to be resuscitated. When he did, some years later, he was not.
Infective disease of the heart had been a major problem but the effectiveness of antibiotics in streptococcal infections, which might otherwise have been followed by acute rheumatism, began to change its incidence. Syphilitic heart disease with aortic incompetence (valve leakage) was yielding to arsenicals, heart damage as a result of diphtheria to immunisation, and infection of heart valves following rheumatic fever to antibiotics.66
There was little effective treatment for coronary artery disease, an increasing problem. Coronary arteries might slowly become narrowed, and a heart attack (myocardial infarction) would occur if arteries suddenly became blocked. Losing its blood supply, heart muscle would be damaged, abnormal rhythms might develop, the patient might suffer great pain and death often occurred rapidly. In 1954, Richard Doll and Bradford Hill reported that there was a high incidence of coronary disease among doctors who smoked, a finding supported a few months later by the American Cancer Society. Its vice-president said that the problems raised by the effects of smoking on the heart and arteries were even more pressing than the more publicised linkage of smoking and lung cancer.67 An association with high fat consumption was also suggested, for populations with the highest consumption also seemed to have the highest death rate from coronary heart disease. The greater incidence in the better-off countries could, however, be due to other factors such as a low level of physical exercise and other features of high standards of living.68
It being an axiom in medicine to rest damaged structures, prolonged immobility was traditional for people with heart attacks. A few specialists, however, suggested that the abrupt and grave nature of the disease, when coupled with long-continued bed rest, devastated the morale of people who had previously been active and healthy. ‘Armchair’ treatment was introduced without any apparent problems.69 Anticoagulation with heparin had been used for deep vein thrombosis since the 1930s, and its value in treating life-threatening pulmonary embolus was beyond dispute. Heparin could be given only by intravenous injection, but a family of coumarin derivatives that could be taken by mouth was developed in the 1940s. Control was difficult, and regular estimates had to be made of the ‘clotting time’. In heart attacks the evidence for anticoagulants was weaker, largely based on a trial in New York in which patients were treated or not according to the day of the week on which they were admitted; even so, a vogue developed for their use.70 Cardiac arrest, the ultimate danger in a heart attack, was sometimes treated successfully with a new piece of equipment – the external cardiac defibrillator.71
Cardiac surgical development was an example of how progress in clinical medicine is the result of developments by many workers in many fields. These included cardiac catheterisation, new methods of measurement, studies on the coagulation of blood, hypothermia, perfusion techniques (the heart-lung machine), pacemaking, the use of plastics, new design of instruments and studies of immune reactions.72 It was the development by Magill and Macintosh of endotracheal anaesthesia (in which a mask was replaced by a cuffed tube inserted into the trachea) that made surgery inside the chest practicable. Cardiac catheterisation was devised in Germany in the 1930s but was not commonplace until the 1950s, when it became the tool used to explore the right side of the heart, to measure atrial, ventricular and pulmonary artery blood pressures and to take blood samples. Combined with arterial blood sampling, catheterisation made it possible to determine the nature of heart valve damage, for example, after rheumatic fever. This permitted good case selection and carefully planned heart surgery. Twenty-four-hour electrocardiography was introduced in the USA by Norman Holter, improving the diagnosis of abnormality of heart rhythm.
Progress in England centred on Guy’s, the National Heart Hospital, Leeds and the Hammersmith, and was led by people such as Russell Brock at Guy’s, Cleland at the Hammersmith, and Thomas Holmes Sellors at the Middlesex. The heart operations undertaken before 1948 had included surgery to repair congenital defects that could be undertaken rapidly without stopping the heart or opening it, for example, operation for patent ductus arteriosus (in which a connection between the aorta and the pulmonary artery remains open after birth). ‘Blue babies’ with congenital heart disease would seldom outlive their teens without surgery.73 Brock operated on some, but several of his earliest cases died. The coroner was alarmed and Brock had to explain the risks of surgery and the way the children selected for operation were already near the point of death. Unless surgeons could develop the necessary operative techniques, all such patients were doomed. Wartime experience with the treatment of bullet wounds of the heart had given surgeons courage to challenge the long-held belief that operating on the heart was dangerous. It was commonly believed that rheumatic heart disease was a disorder of heart muscle and not primarily due to valve damage. Some surgeons, however, believed that valve damage was the crucial lesion; in 1948 three surgeons, Dwight Harken and Charles Bailey in the USA, and Brock at Guy’s, independently performed successful mitral valvotomy for mitral stenosis, widening the opening of valves that had become partially fused and were restricting blood flow. Brock attempted three operations within a fortnight. The surgeons were entering unknown territory and their work proved that the problem of chronic rheumatic heart disease was primarily mechanical. Brock’s work was followed by Thomas Holmes Sellors at the Middlesex in 1951.74 There was a backlog of seriously sick people in or approaching heart failure. The first operations had a high mortality, seven in the first 20 of Brock’s series. This rapidly improved to about 5 per cent for mitral valvotomy, and more difficult lesions such as pulmonary stenosis were tackled.75 Many of the patients were young men and women doomed to an early death without surgery. Sometimes the type of repair needed was beyond the techniques available. Yet risky though the attempts were, particularly on pulmonary and aortic valves, there was often no alternative.
The introduction of hypothermia in the early 1950s was the next advance. It was found that, at a body temperature of 30°C, the heart could be stopped for ten minutes. The commonest method was immersion in a bath of cold water. It proved possible to repair some atrial septal defects (openings in the division between the two atria) and make an open direct-vision approach to the pulmonary and aortic valves. Hypothermia could also be used in the resection of aortic aneurysms (ballooning of a weakened section of the aorta).76 Perfusion came next. The technique of producing temporary cardiac arrest using potassium was worked out by Melrose, a physiologist at the Hammersmith Hospital. Heart-lung machines were developed by the Kirklin unit at the Mayo Clinic in the USA, and in England by Melrose and Cleland at the Hammersmith. There was much to be learned; Kirklin reported six deaths in his first ten cases, and a further six in the next 27. But, by the time he had reached 200 cases, deaths from the procedure were rare.77 British cardiac surgeons deliberately held back and waited to see what the outcome of Kirklin’s work would be. When he had developed reliable procedures, three British units at the Hammersmith, Guy’s and Leeds began work. All were well equipped, well staffed and expertly run departments. A pattern was set; cardiac surgery became established in regional centres, usually in association with a university teaching hospital. Only near such surgical facilities could advanced cardiology develop effectively.
Cardiac arrest was not necessary for operations on large blood vessels such as the aorta. Coarctation of the aorta, in which the vessel became narrowed, and aortic aneurysms also became manageable surgically.78 After the introduction of angiography, in which solutions that were opaque to X-rays were injected into blood vessels, the frequency of atheromatous obstruction of the internal carotid artery was realised. Angiography was an uncomfortable and sometimes hazardous investigation. Urged on by George Pickering, Rob and Eastcott performed the first carotid endarterectomy at St Mary’s Hospital in 1954, on a woman with transient episodes of hemiplegia and difficulty with speech. Although an increasing number of patients were treated, it remained a risky operation.79
Renal replacement therapy
Life-threatening kidney disease might be either acute or chronic. Acute renal failure, from the crush injuries of the blitz, a mismatched blood transfusion or prolonged low blood pressure from blood loss, might get better if the patient could be kept alive long enough. If great care was taken with fluid intake and diet, some survived. In 1943 Kolff, in Holland, showed that patients in renal failure could be kept alive by artificial haemodialysis. Few were thought to be suitable for this, and it was mainly used for those in acute renal failure from which spontaneous recovery was to be expected. It was not offered to patients who had an irreversible condition from nephritis associated with streptococcal infection, diabetes or high blood pressure.80 Indeed it was thought unethical to offer dialysis to those with chronic disease, as it would only delay an inevitable and unpleasant death. However, in 1954 a successful renal transplant was undertaken in the USA. The patient, who had chronic renal failure and would otherwise have died, received a kidney from an identical twin. Although only one person in 100 would have the chance of a sibling’s kidney that the body’s immune system would not reject, it now became important to ask everyone with chronic renal disease whether he or she was a twin.
Neurology and neurosurgery
The great developments in descriptive neurology and neurosurgery largely preceded the NHS, under the influence of North American surgeons such as Harvey Cushing and Wilder Penfield, and British neurologists such as Francis Walshe. The central nervous system, once damaged, did not regenerate, nor could it be repaired surgically. The specialty centred on the accuracy of diagnosis. Seldom was there any treatment available; only three out of 100 papers published in Brain held out any hope for the patient. Shortly before the NHS started, the RCP committee on neurology, seeing a need to develop the specialty outside London, recommended the development of active neurological centres in all medical teaching centres, in which neurology, neurosurgery and psychiatry should work together.81 At least one such centre, in Newcastle, equalled anything in the south. There, Henry Miller was followed by John Walton and David Shaw. Miller, who was interested in immunological disease, pointed out the advantages of the neurologist working in a hospital providing district services, who would see local epidemics, deal with people who were at an early stage of their disease and were often acutely ill, and be in close contact with other physicians.82 Miller, and Ritchie Russell in Oxford, who was interested in poliomyelitis, began to reorientate neurology and link it more closely to general medicine. Attitudes began to change, with concentration on the prevention of damage in the first place, on altering the biochemistry of the nervous system, and on rehabilitation. Developments elsewhere in medicine, in clinical pharmacology, imaging and later genetics, drove neurology and neurosurgery, which advanced steadily as specialties rather than experiencing sudden and major developments.
In the 1950s, neurosurgery dealt with head injuries, brain tumours, prefrontal leucotomy for mental illness, destruction of the pituitary for advanced cancer of the breast and precise surgery deep in the brain for Parkinson’s disease (stereotaxic surgery). New diagnostic investigations, in particular cerebral arteriography, helped it. Seeing the circulation of the brain was possible by taking a series of radiographs in rapid succession after the injection of contrast medium. Cerebral tumours and intracranial haemorrhage, cerebral aneurysms and cerebral thrombosis were all revealed, making diagnosis more accurate and operation more successful.83
Ear, nose and throat (ENT) surgery
Three main developments – antibiotics, better anaesthesia and the introduction of the operating microscope – underpinned advances. Until the introduction of antibiotics, the main function of the ENT surgeon was to save life by treating infection, acute or acute-on-chronic, affecting the middle and inner ear, the mastoids and the sinuses. Untreated infection could spread inside the skull, leading to meningitis and brain abscesses. By 1950 such catastrophic diseases were rare. The work of ENT surgeons altered substantially and those mastoid operations still being carried out were usually for long-standing disease.84
Zeiss produced the first operating microscope specifically for otology in 1953, revolutionising ENT surgery. Surgeons began to turn their attention to the preservation of hearing, the loss of which they had previously accepted as inevitable. Chronic infection of the middle ear prevented the movement of the three minute bones that transmitted sound. Some operations that were now popularised had been attempted 50 years previously, but without magnified vision and modern instruments and drills they had been abandoned. Though simple in conception, the operations demanded scrupulously careful technique and great patience.85 Among the first to become widespread was an operation for otosclerosis, to free up certain small bones in the middle ear (mobilisation of the stapes), or to remove them (stapedectomy). Tympanoplasty (repairing damage to the middle ear) was described in Germany in 1953. Under the influence of surgeons such as Gordon Smyth of Belfast, the procedure was rapidly introduced into the UK.
The commonest ENT operation, indeed the commonest operation of all, was ‘tonsils and adenoids’ (Ts and As). Surgeons seemed most convinced of the benefits, whereas the MRC regarded the procedure as a prophylactic ritual carried out for no particular reason and with no particular result. John Fry, a Beckenham GP, in a careful analysis of his patients, concluded that although nearly 200,000 operations were carried out annually, the number could be reduced by at least two-thirds without serious consequences. Operation was usually carried out for recurrent respiratory infections, problems that tended to resolve naturally at around the age of seven or eight. The operative rates seemed to depend entirely on local medical opinion. A child in Enfield was 20 times as likely to have an operation as one in nearby Hornsey; the children of the well-to-do were most at risk of operation.86 From the mid-1940s there was dramatic growth in the incidence, or recognition, of ‘glue ear’ in children, a condition that made them deaf. Thick gluey mucus remained in the middle ear, usually after upper respiratory tract infections. It was uncertain whether this was related to the widespread use of antibiotics, but an operation in which the mucus was removed by suction and a grommet inserted through the eardrum succeeded Ts and As as the commonest operation worldwide.
In the non-surgical field, the MRC had designed a hearing aid shortly before the NHS began, the Medresco aid. It was developed by the Post Office Engineering Research Station at Dollis Hill, assembled by a number of radio manufacturers instead of the hearing aid industry, and issued free of charge on the recommendation of a consultant otologist. The market was a large one, but the Medresco aid, though cheap, was behind the times. It consisted of a body-worn receiver connected to an ear-piece. Transistors, incorporated into commercial aids from 1953, were not used in the aids issued free by the NHS until several years later.
Ophthalmology
The availability of free spectacles under the NHS revealed a huge pent-up demand from the public, largely satisfied by opticians under the supplementary ophthalmic services. Ophthalmologists seldom saved lives but their ability to maintain function by preserving sight ensured the specialty’s place in every district hospital. The specialty was a pioneering one, lending itself to technical innovation, but it had a low priority in many undergraduate courses, although postgraduate education at hospitals such as Moorfields was world renowned. Many diseases, for example, high blood pressure, diabetes and some genetic conditions, involved the eye. Ophthalmology collaborated effectively with many specialties in sharing diagnostic advances such as ultrasound and, later, scanning. Operating microscopes were becoming available. Transplant surgery was being pioneered by ophthalmologists in the form of corneal grafting. The treatment of cataract involved the removal of the now opaque lens, an early example of microsurgery, and the supply of powerful glasses. Operation was postponed until a late stage of visual loss. Harold Ridley, working at St Thomas’, had treated a Spitfire pilot with a piece of Perspex from the cockpit canopy embedded in his eye. The plastic seemed well tolerated and it was suggested that a plastic lens might also be accepted. A surgeon of great skill, Ridley pioneered the implantation of a lens into the eye in 1949, and had many successes, although others were not able to achieve his results for some years. His achievement was celebrated by the issue of a Medical Breakthrough stamp in 2010. Detachment of the retina, a largely untreatable condition, was managed by prolonged bed rest until the photocoagulator was introduced around 1950.
Cancer
The treatment of cancer involved surgery if a cure was thought possible. If the disease was past the point at which surgery could help, radiotherapy was used as palliation. Although surgery was the foundation of treatment in common cancers such as that of the lung, many patients were inoperable when they first presented, and the five-year survival was low.87 Surgeons became increasingly radical in an attempt to eliminate tumours. Few people were told their diagnosis; only the relatives were informed. The phrase ‘cancer chemotherapy’ was largely incomprehensible and the claim that malignant disease could be controlled or even cured by drugs was more appropriate to the charlatan than the physician. The physician’s place was to administer the medical equivalent of extreme unction – opiates and comfortable words. Radium or kilovolt irradiation could, in fact, produce worthwhile remissions and some long-lasting cures, but radiotherapy was seldom seen as curative. Radium was replaced as post-war developments in atomic energy made artificial isotopes available. Gamma-emitting sources such as cobalt-60 provided a vastly more powerful source and were first used to treat patients in 1951. This made it possible to deliver a high dose internally without massive skin damage. By 1955 there were 150 telecobalt machines worldwide; six years later there were over 1,000. Linear accelerators, a by-product of wartime research on radar, were also introduced. The NHS ordered four to be installed in major units, the Hammersmith getting the first in 1953. ‘Super-voltage’ machines became an intrinsic part of the equipment of radiotherapy departments, and radiotherapy was progressively organised as an integrated regional service, with just one such department in a given region.88 The introduction of radio-isotopes was the great hope for the future, because of the possibility that they would be concentrated selectively in tumours. Only rarely did they prove an advance.
The modern era of leukaemia therapy began in the 1940s with the work of Sidney Farber, then pathologist at the Children’s Hospital, Boston. Farber had the idea of disrupting the growth of malignant cells with antimetabolites. The years 1940–1950 saw the discovery of several drugs, later useful in curing cancer. Nitrogen mustard had been used since 1942 and produced striking, although temporary, regression of the tumours. The next useful drug to be discovered came from the knowledge that folic acid deficiency was associated with bone marrow inhibition. Metabolic antagonists to folic acid, such as aminopterin, were shown to produce temporary remissions in childhood leukaemia.89 Corticosteroids were also shown to have anti-tumour properties, both in experimental animals and in humans. Mercaptopurine was the result of biochemical reasoning that nucleic acid metabolism might be altered. By the 1950s many drug development programmes were under way in the USA, industry was becoming interested and clinical trials were starting. Although medicine remained largely impotent in the face of disseminated cancer, the BMJ optimistically but correctly said that the foundation of a logical approach to the problem had been laid and an efficient machinery for the selection and testing of remedies devised.90 A new diagnostic tool for cancer was emerging: exfoliative cytology, looking for malignant cells on mucous surfaces and in body secretions. Before the war, Professor Dudgeon at St Thomas’ routinely used cytology in the diagnosis of cancer of the lung and cervical cancer. King George VI’s cancer of the lung was diagnosed there by sputum cytology. Papanicolaou’s work in 1943 placed this development on an increasingly firm basis and it was developed progressively during the first ten years of the NHS, placing an extra burden on pathology departments.
Smoking and cancer
As the impact of infectious diseases lessened, the importance of cancer increased. Mass radiography, introduced in the years of war to detect tuberculosis, increasingly revealed carcinoma of the bronchus, although it was ineffective as a screening measure. In the first ten years, 10 million examinations were carried out and 2,000 cases of intrathoracic cancer were found, 90 per cent of them in men.
Unlike malignancy as a whole, cancer of the respiratory system had shown a steady rise since the early 1920s. Many thought this was due to better diagnosis, or that a fall in the number of cases of tuberculosis had thrown cancer of the lung into greater prominence, or that sulphonamides had allowed people to survive pneumonia long enough to develop the signs of cancer. Studies, some in Germany during the second world war, had associated heavy smoking with lung cancer.91 Percy Stocks, at the General Register Office, thought that atmospheric pollution might be involved and wrote to the MRC in 1947 to say that further investigation was warranted. With typical common sense, Bradford Hill brushed aside the suggestion of air pollution; husbands and wives experienced similar exposures but smoking men got cancer while their non-smoking wives did not.
An MRC conference concluded that it would be unwise to assume that all the rise was an artefact, and Bradford Hill was asked to carry out a study, which he did with the help of Richard Doll. The two research workers asked hospitals to notify the admission of patients with possible cancer of the lung, stomach and large bowel; they took their smoking histories and followed them up after discharge. Practically none of those with cancer of the lung were lifelong non-smokers; the rise was a real one and not merely due to better diagnosis. The findings, the result of interviewing 649 men and 60 women with carcinoma of the lung, were presented to Harold Himsworth at the MRC in 1949. Himsworth thought it crucial to ensure, before publication, that the results were right and asked for further hospitals outside London to be included in the study, which was extended to Leeds, Newcastle, Bristol and Cambridge. Their paper, published in 1950 shortly after an American case-control study by Wynder and Graham, claimed a causal connection between smoking and lung cancer. At ages 45–74 years, the risk was 50 times greater among those smoking 25 cigarettes a day or more than among non-smokers.92 The iconic paper was probably that of Doll and Bradford Hill on the mortality of doctors in relation to their smoking habits. The BMJ said that the practical question which doctors in practice had to answer was whether any patients, for instance those with a smoker’s cough, should be advised to give up smoking.93
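The reasoning behind such a figure can be shown in a few lines. The sketch below is purely illustrative: the counts are invented, chosen only so that the answer comes out at 50, and are not the data of the 1950 paper. It shows how, in a case-control design of this kind, the odds ratio from a two-by-two table approximates the relative risk when the disease is rare.

```python
# Illustrative only: how a case-control comparison of the Doll and
# Bradford Hill type yields an estimate of relative risk. All counts
# below are invented; they are not the figures from the 1950 paper.

def odds_ratio(cases_exposed, cases_unexposed,
               controls_exposed, controls_unexposed):
    """Odds ratio from a 2x2 table; in a case-control study this
    approximates the relative risk when the disease is rare."""
    return (cases_exposed * controls_unexposed) / \
           (cases_unexposed * controls_exposed)

# Hypothetical table: lung-cancer patients against matched controls,
# split by heavy smoking (25 or more cigarettes a day).
estimate = odds_ratio(cases_exposed=150, cases_unexposed=3,
                      controls_exposed=50, controls_unexposed=50)
print(f"estimated relative risk: {estimate:.0f}")  # prints 50
```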
Many doctors, unaccustomed to controlled studies, remained unconvinced, so Doll and Bradford Hill launched one of the earliest prospective studies. It involved 40,000 doctors, a group that was studied for the next 40 years.94 They published an extension to their case-control enquiry in 1952. The BMJ said that the probability of a causative connection was now so great that one was bound to take what preventive action one could. The younger generation would have to decide, each for himself or herself, whether the additional risk of smoking was worth taking.95 The Standing Advisory Committee on Cancer and Radiotherapy, chaired by Sir Ernest Rock Carling, himself a lifelong heavy smoker, gave no advice on which the Ministry could act. Meeting twice in the first half of 1952, it advised the Minister that the statistical evidence strongly suggested an association between smoking and cancer of the lung, but that this evidence was insufficient to justify propaganda. The Committee thought, in any case, that it would be undesirable for central government to be involved in cancer education, and that it should be left to local authorities and voluntary bodies.96 The government got no help on which to act, even had it been minded to. Richard Doll published further material in 1953, and the following year Bradford Hill and Doll published the preliminary results of the prospective study, which succeeded in changing attitudes.97 Largely for financial reasons, the government was not keen to publicise the increasingly certain connection between smoking and cancer.98 A panel set up subsequently advised the Minister that it must be regarded as established that there was a relationship between smoking and cancer of the lung, and that it was desirable that young people should be warned of the risks apparently attendant on excessive smoking. On 12 February 1954, the Minister made a statement in the House.99
No urgent action was felt necessary. In 1956, the Cabinet considered the issue. In response to the Health Minister, Robert Turton, who suggested warning the public, Macmillan said that this was a “very serious issue. Revenue was equivalent to 3/6d on income tax: not easy to see how to replace it.” He added: “Expectation of life is 73 for smoker and 74 for non-smoker. Treasury think revenue interest outweighs this. Negligible compared with risk of crossing a street.” The government resolved to wait until later in the year, when another medical report was due.
The death of George VI, a heavy smoker who suffered from arterial disease in the legs, coronary artery disease and cancer of the lung, was not associated in the public mind with tobacco.100 Its addictive properties were hardly recognised, and it was thought that, if the risk was made clear, people would respond. The tobacco industry spent enormous sums on promotion and the Ministry sat back, baffled. Sir John Charles, the CMO, was not a man to stick his neck out. He talked of the “mysterious and inexorable rise in cases”. In his reports he said that the convinced individual could largely avoid exposure to tobacco smoke if he so wished. The Ministry asked the MRC if it would undertake further research into the relationship of smoking and cancer and was told that, as the answer was known, it would be a waste of money. Asked for a formal opinion on the relationship in 1957, the MRC published its response in the professional journals: the increase in lung cancer was attributable to the increase in cigarette smoking.
Obstetrics and gynaecology
Pre-war, the high maternal mortality rate had been of great concern. The chance of a mother dying from her pregnancy or associated causes in 1928 was one in 226. Janet Campbell, a Ministry doctor, had devised a pattern of regular antenatal supervision for the poor in London’s East End. However, antenatal care remained patchy, many mothers did not use the services and GPs had played only a minor part. By 1948 the maternal mortality rate was falling, although there was no evidence that this was the result of antenatal supervision.101 The perinatal mortality rate (stillbirths and the number of infant deaths in the first week of life per 1,000 births) had also fallen, but appeared to have levelled out at 38.5 per 1,000 in 1948.
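A minimal worked example may make the definition concrete. The figures below are invented for illustration only (they are not the 1948 returns), chosen to land near the 38.5 per 1,000 quoted above.

```python
# Sketch of the perinatal mortality rate as defined above: stillbirths
# plus deaths in the first week of life, per 1,000 total births.
# All figures are invented for illustration.

def perinatal_mortality_rate(stillbirths, first_week_deaths, live_births):
    total_births = live_births + stillbirths
    return 1000 * (stillbirths + first_week_deaths) / total_births

rate = perinatal_mortality_rate(stillbirths=18_000,
                                first_week_deaths=13_000,
                                live_births=787_000)
print(f"{rate:.1f} per 1,000 births")  # about 38.5
```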
Since the 1930s there had been a gradual increase in babies being delivered in hospital. Cross-infection in maternity hospitals had been a constant danger, but antibiotics had reduced this risk. The Royal College of Obstetricians and Gynaecologists (RCOG) in 1944 had advocated that 70 per cent of deliveries should be in hospital, and ten years later raised its target to 100 per cent. During the 1950s, the percentage of births taking place in hospital remained fairly static at around 65 per cent, but then it started to rise.102 Public opinion was drifting to the view that hospital was best, and mothers increasingly chose it as safest for themselves and their babies. It was free under the NHS, home-helps were in short supply and the home might be unsuitable or overcrowded. The birth rate was rising and the demand for beds outstripped supply, in spite of which maternity beds were sometimes turned over to acute cases and tuberculosis. The Ministry thought it difficult to justify the provision of beds for normal cases “simply because the mother prefers to have her baby in hospital”.103 It was said that, to get a bed in hospital, you had to book three months before you were pregnant; hospitals had a monthly quota and it was first come, first booked.104 District midwives delivered many at home who could not be fitted into the hospital, even when hospital delivery was indicated. Lack of pain relief was the main complaint women had of the maternity services. Midwives were not permitted to give pethidine until 1951. Only 20 per cent of women delivered at home received any form of pain relief, usually gas/air, and only half of those in hospital. Women stayed in hospital for 14 days after delivery, so antenatal patients who needed admission, perhaps because high blood pressure and toxaemia posed a hazard to mother and baby, could not be admitted: the obstetric beds were full of mothers, most of whom were fighting fit and desperate to go home. In the mid-1950s Geoffrey Theobald, at St Luke’s Hospital in Bradford, realised that women could be discharged home safely after 48 hours, provided the district midwives kept an eye on them. The ‘Bradford experiment’ meant more antenatal beds.105
The question of home or hospital delivery became contentious, although there was no sound information on which was safer, nor a clear view on the cases that should be booked for hospital. To begin with, the accent was placed on housing and social problems. Later it shifted towards obstetric risks, the mother’s age and the number of children she had already had. In 1954 Professor WCW Nixon, of University College Hospital, arranged a meeting of experts to discuss the possibility of obtaining data on the relative risks of hospital and home confinement. Out of this grew the perinatal mortality survey of the National Birthday Trust Fund, a charity working to improve the health of mothers and their babies. Not all GPs were up to date; some were unconvinced about the need for systematic antenatal care. The RCOG had stressed that GPs undertaking midwifery should have special experience, and supported the midwives who undertook regular postgraduate training. GPs saw fewer cases, particularly of operative obstetrics. The average GP had 30–40 deliveries a year, including those that went to hospital or were handled by midwives. Was this enough to maintain skills? Some GPs felt threatened and did not want to co-operate in a consultant-led service. Sometimes midwives respected the GPs with whom they came into contact; often they did not.
Maternal mortality was lowest in areas with a unified organisation of maternity services. In pioneer areas, midwives, GPs and consultants organised themselves in partnership, a partnership that was in the GP’s interest and imperative for mother and child.106 In Hertfordshire a system of shared care was adopted in which, after hospital booking, the GP undertook antenatal care until the 36th week. In Bristol a similar system operated, based on a health centre, with GPs working alongside midwives and hospital staff in managing pregnancy.107 In Oxford good relations were established between the RHB, the obstetric departments and the GPs with their local maternity units. The confidential enquiry into maternal deaths, the first serious British attempt to scrutinise the outcome of care, was introduced in 1952 following an international conference on obstetrics in London and discussion between the President of the RCOG and George Godber at the Ministry. The first report was published in 1957.108 It followed a smaller pre-war study in Britain and a classic study of maternal mortality in New York City in 1933. The enquiry was voluntary and confidential, for only if the reports were treated as privileged, and never disclosed to anyone other than the professional staff handling them, could frankness be expected. The registration of a death related to pregnancy was the starting point. Information was obtained from the GP and local obstetrician, the report then going to a regional assessor, a senior obstetrician appointed after consultation with the President of the RCOG. Consultant advisers to the Ministry of Health in obstetrics, anaesthetics and pathology made a final assessment. Avoidable factors were found too often for the opportunity to improve matters to be allowed to pass. Reduction of deaths from toxaemia and haemorrhage was important. The survey highlighted a danger appreciated since the 1930s, when the first obstetric ‘flying squad’ was established in Newcastle. Women with a retained placenta after home delivery were often put into ambulances and sent to hospital without either transfusion or manual removal of the placenta, only to be found moribund on arrival.
Obstetrics was developing increasingly fast. In 1955 Ian Donald, in Glasgow, used ultrasound for the first time to examine an unborn baby. It became the preferred technique for monitoring the progress of pregnancy, replacing radiology, which had been shown by Alice Stewart in Oxford to put babies at risk. Theobald introduced a new method of inducing and increasing the strength of labour, the oxytocin drip, and there were advances in reducing postpartum haemorrhage and the delivery of the placenta (the third stage of labour). In a few units (e.g. University College Hospital, London) there was interest in the mother as a person; husbands were allowed and even encouraged to be with their wives during labour, a policy viewed in most hospitals as eccentric.
Paediatrics
Paediatrics, as a specialty, was weak in 1948 and there was little systematic training. Unlike the situation in North America, GPs provided much paediatric care. The problem of infectious disease seemed likely to be solved by the antibiotics. Specialists in diseases of the heart, the lungs and the joints cared for many children, and few of those in a children’s ward were under the care of a paediatrician. If born in hospital, the baby was in the care of the obstetrician, and relationships with paediatricians might be prickly. Many diseases of children were becoming less common, for example, rheumatic fever and tuberculous meningitis.
In the years preceding the second world war, special units for premature babies had been created in some places; for example, Mary Crosse’s department in the grounds of the Sorrento Maternity Hospital in Birmingham. Crosse and her nurses would go out in taxis with hot water bottles to bring in small and premature babies. Victoria Smallpeice in Oxford was another pioneer. Retinopathy of prematurity, producing blindness in premature infants in the first few weeks of life, was described in Boston in 1942. In the late 1940s and early 1950s the number of cases in the UK surged. For ten years little more was known about the condition than an association with low birth weight and premature baby units; the cause remained a mystery. Mary Crosse, in a flash of intuitive brilliance, suggested that it might be due to the use of high concentrations of oxygen in incubators.109 She found no case in Birmingham before 1946 but, of the first 14, 12 had been on continuous oxygen for between two and five weeks. It was the additional money that came with the NHS, she said, that enabled centres of expertise to buy incubators and the expensive oxygen required. There had been a well-intentioned but misguided change in care; indeed, nurses and doctors might object to the suggestion that oxygen for sick babies should be restricted. In 1951, the MRC started sifting the records of maternity units and ophthalmic units. This confirmed the connection and showed that a high concentration over a period of several days was dangerous. Oxygen levels were reduced and the incidence of the disease fell greatly.110 Retinopathy of prematurity was not the only disease caused by medical treatment; increasing interest in the neonate was accompanied by the rapid adoption of new drugs such as chloramphenicol and the sulphonamides, and, as immature babies did not metabolise these like adults, overdosage might occur.
The emotional problems of sick children in hospital were not understood in 1948. Children’s wards might only allow parental visits for an hour on Saturday and Sunday, and would discourage telephone enquiries. Children admitted to hospital were usually placed in adult wards and few staff felt it necessary to explain their treatment to them. Simple things, such as moving them from one bed to another in the ward, or the use of red blankets, could create anxiety. Some children would react by withdrawing into themselves, others by seeking friendship and reassurance from everyone. However, paediatricians, such as Sir James Spence in Newcastle and Alan Moncrieff at Great Ormond Street, drew attention to the great distress caused by the ‘no visiting’ policy.111 Particularly if in hospital for a considerable period, the infant forgot the mother and clung to the nurse when the time for discharge came, to the distress of all three. Nursing staff sometimes became possessive about children. An experiment in daily visiting was tried; the mothers liked it and the nurses preferred the closer contact with the family. John Bowlby, Director of the child and family department at the Tavistock Clinic, published a book on maternal and child health care in 1951. This drew attention to the devastating effect of separation from the mother. It was followed in 1953 by a film, ‘A two year old goes to hospital’, that showed the traumatic and long-term effects on the young child suddenly separated from the mother and placed in strange surroundings. Daily visiting, seldom permitted previously, was progressively introduced. A second film in 1958, ‘Going to hospital with mother’, made it clear that the presence of the mother should be the norm, not the exception. Nurses should change their role from mother-substitute to adviser and friend, giving the mother understanding and the child skilled nursing.112
In 1956 Caffey, a radiologist at Columbia University and the Babies Hospital, New York, speaking to the British Institute of Radiology, described a group of children suffering from trauma. Paediatricians, he said, faced with unexplained pain and swelling in the limbs, usually embarked on an elaborate search for vitamin deficiencies and metabolic diseases. Simple trauma was given short shrift by those bent on solving the mysteries of more exotic diseases. Correct diagnosis of injury might, however, be the only way in which abused youngsters could be removed from their traumatic environment. Once the ‘battered baby syndrome’ was recognised, many cases came to light, usually in children under the age of two who had suffered repeated injuries, often ascribed to ‘falling downstairs’ but in reality caused by their parents. They might have brain injury, fractures of the limbs and ribs, multiple bruises and injuries. Other children who ‘failed to thrive’ had been persistently underfed or emotionally neglected. Often the families in which the cases arose already had many other problems. There was widespread media interest, and health visitors, GPs and casualty departments now had something else to look for.113
A major cause of babies dying during labour or in the week after birth (perinatal mortality) was haemolytic disease of the newborn. The condition had been defined and its cause worked out in the USA by Darrow, Levine and Wiener in the 1940s. Six out of every 1,000 babies suffered from it, as a result of incompatibility between a rhesus-positive baby and a rhesus-negative mother who had developed antibodies during a previous pregnancy. Fifteen per cent of the babies affected were stillborn, and some of the others were rescued only by replacing the baby’s blood by an emergency exchange transfusion. By the early 1950s, mothers developing antibodies to the rhesus factor were admitted to units with special facilities. Exchange transfusions and early induction of labour produced some improvement in the mortality for rhesus-positive fetuses, but there was no way to reduce the numbers of rhesus-negative mothers who became sensitised during pregnancy and delivery.114
Geriatrics
The wartime hospital surveys had shown that care and accommodation for the ‘chronic sick’ were often inadequate, but the size of the problem made it hard to solve. Care had largely been custodial, with little more than minimal attention from few staff, either in the back wards of hospitals or else in units separated from acute services. During the war years, Marjory Warren, at the West Middlesex County Hospital, found herself looking after the chronic sick wards. She argued that geriatrics should be treated as a special branch of medicine, staffed by those specially interested in the subject. With greater effort, more patients could be discharged. The elderly should be cared for within the curtilage of district hospitals where special investigations and rehabilitation were available. A change in the attitude of the profession was called for, and the care of the chronic sick should be an important part of medical and nursing education.115 Other pioneers included Lionel Cosin, who established the first day hospital in Oxford, and Tom Wilson, the first consultant geriatrician appointed in Cornwall in 1948. They got excellent results and a more intensive use of their beds by treating acutely ill old people vigorously in short-stay wards, taking medical, social and psychological problems into account. The achievement of independence depended on a high standard of medicine, good teamwork and an atmosphere of optimism and activity, combined with the patient’s confidence and co-operation.116 Marjory Warren, speaking in 1950, said people could be treated in their own homes if there was co-ordination of GPs and consultants, domiciliary consultations with the geriatric team, and home help and district nurses. ‘Keep them in bed and keep them quiet’ was replaced by ‘get them up and keep them interested’.117 The Nottingham City Hospital established a geriatric unit in 1949, fully staffed with physiotherapy, occupational therapy, chiropody and links with psychiatry and the local authorities.118 University College Hospital was the first teaching hospital to establish a unit, under Lord Amulree, at St Pancras Hospital. Nursing faced a major new demand, preventing patients from joining the ranks of the bed-fast, stiff, incontinent and dull of mind.119 There was also a preventive aspect; in Salford a health visiting service was developed for the elderly. The co-ordination of domiciliary services, physiotherapy, chiropody, laundry and bathing attendants could prevent admission to hospital. Health visitors could remedy gaps in the service and deal with the needs of families as a whole.120
The Ministry guidance on specialist services had suggested that general physicians would give an increasing amount of time to the chronic sick. They did not. In 1954 the Ministry organised a national survey of the services for the elderly, collated by Boucher, the senior medical officer concerned.121 Some waiting lists were so long that GPs had stopped referring patients. An administrator might determine the priority of admission, sometimes swayed by the importunity of the family doctor. Waiting lists were seldom reviewed and were grossly inaccurate. Accommodation could be in long, rambling, draughty buildings far from other hospital services. An outside cast-iron staircase served one ward on the first floor over a boiler-house and a paint store. Bed turnover – the number of patients treated per bed each year – might be as low as 1.4 in some regions. In one group of 447 beds, the physician ‘did not believe in geriatrics’. A nearby colleague with 417 beds was mainly interested in paediatrics and, unable to raise enthusiasm for the elderly, had not visited them for months. In contrast were units that had adopted a more active approach, assessing patients before admission and campaigning for physiotherapy and occupational therapy services. Cosin at Oxford and Olbrich at Sunderland had annual bed turnovers of 3.6 and 5.6 respectively. Active treatment, rehabilitation and discharge were coupled with readmission if patients were unable to maintain independence, even with domiciliary services. Where the consultant’s primary interest was in elderly people, the service benefited incomparably. If he had other interests, the elderly always took second place to acute patients.
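A minimal sketch of that measure, with invented discharge figures (only the 447-bed unit size is taken from the survey account above), shows how sharply it separates custodial from active units:

```python
# Bed turnover: patients treated per bed per year. The discharge
# figures below are invented for illustration; only the 447-bed
# unit size comes from the survey account above.

def bed_turnover(discharges_per_year, beds):
    return discharges_per_year / beds

print(round(bed_turnover(626, 447), 1))    # 1.4 - custodial care
print(round(bed_turnover(2500, 447), 1))   # 5.6 - active management
```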
The Ministry could now press for the development of geriatrics as a specialty. A pool of doctors who had trained in general medicine was looking for posts, and geriatrics provided them with opportunities. The Advisory Committee on Consultant Establishments helped them, and more than 60 geriatric units were soon established with the more modern philosophy, although most of the new consultants found themselves working in poor accommodation. A new group of specialists had emerged: physicians interested in treating the elderly and not merely looking after them. Their first task was to deal with the vast number of patients they had inherited, introducing active management, cutting the number of beds they needed and reducing the waiting lists. They introduced domiciliary assessment and outpatient care for people waiting for a bed, many of whom had social rather than health problems. If the beds were properly used, there was probably no shortage.
Mental illness
Many developments in psychiatric practice took place in the RHB hospitals, often in the provinces, and were largely divorced from the growing points in acute medicine, the teaching hospitals and universities.122 The psychiatric departments of the teaching hospitals, where they existed, were not part of the mainstream, saw few psychiatric emergencies and undertook a different type of work. Their interests mostly lay in psychological medicine and the psycho-neuroses. They drew a different group of patients, often of higher social status, who hoped for greater courtesy and personal attention than was usual in the general hospitals, and that outpatient care rather than admission would follow expert and thorough assessment.123 Professorial units existed only in Leeds, at the Maudsley, and later in Manchester. Before the second world war, most psychiatric patients had been ‘certified’, although the Mental Treatment Act 1930 had enabled the admission of voluntary patients and the establishment of outpatient clinics. By the early 1950s, two-thirds of patients were voluntary and not under a compulsory order. As people became more willing to be admitted to a mental hospital, increasing numbers led to overcrowding. Yet services were far from comprehensive, and were poor or non-existent for the elderly who were mentally infirm, for mentally ill offenders, and for adolescents. Drug addiction was hardly recognised as a problem, nor was attempted suicide, which was occurring more and more often.
Physical methods of treatment had long been used, virtually always for schizophrenia. Convulsion therapy used chemicals to induce fits, but was abandoned as patients could remember the entire episode in frightening detail. Electroconvulsive treatment (ECT), introduced in 1938, produced amnesia, and was given without anaesthesia. Six strong nurses held the patient down, but the strength of the muscle contractions frequently produced injuries, particularly crush fractures of the vertebrae. The introduction of anaesthesia and muscle relaxants overcame many of its evils, and ECT clinics treated a dozen or more patients in a session. Deep insulin therapy, also introduced for schizophrenia, was at times hazardous and occasionally fatal. It was progressively questioned and ceased to be used in the late 1950s. Prefrontal leucotomy was at first regarded as a major advance in therapy but proved to be damaging to the patient’s personality. Its use for schizophrenia ceased, and more limited operations came into vogue. Psychiatrists awoke from a wish-fulfilling dream to the realisation that they had been unwittingly party to a game of ‘Emperor’s new clothes’.124 Only ECT, for depression rather than schizophrenia, proved of lasting use. Just as physical methods were being given up, drugs appeared. In 1952 chlorpromazine (Largactil) was introduced. Psychiatric practice was already undergoing major change. Henry Rollin, a psychiatrist at Horton Hospital, Epsom, and the anonymous author of a number of BMJ editorials, wrote that such powerful drugs heralded a revolution in the treatment of schizophrenia. If admission to hospital was necessary, the stay could now be measured in weeks rather than months.125 People with a recent onset of illness had a higher likelihood of early recovery and began to be accommodated separately from long-term patients, difficult in hospitals of traditional design where the buildings were arranged for security rather than comfort and resocialisation.126
By 1948 some mental hospitals had opened their doors. Their doctors believed that most, if not all, patients could be persuaded to co-operate, and that locked doors, at any rate in the daytime, should be as obsolete as chains. An attempt was being made to improve the character of the institutions by the introduction of occupational therapy, music and art classes. The Lancet published an account of hospitals where this policy worked well.127 TP Rees at Warlingham Park, Croydon, had kept all but two of the 23 wards open for 12 years, and Macmillan in Nottingham, with 1,100 patients, had all the wards open day and night. Patients did not abscond, and they were unlikely to wander off if they had a congenial task. Depressed patients might be at risk of suicide if not treated with ECT on the day of admission, but the system was better for both staff and patients. Nurses preferred not being warders, and tensions in the wards were fewer. Overcrowding led to a need to expand outpatient treatment and to Joshua Carse’s ‘Worthing experiment’, based on Graylingwell Hospital, Chichester.128 The regional board and the Nuffield Provincial Hospitals Trust sponsored a two-year trial of intensive outpatient and domiciliary treatment that was rapidly seen to work, reducing admissions.129 Psychotherapy was simpler on an outpatient basis, and as the antidepressants were introduced, more people could be treated without admission. With better anaesthesia, outpatient ECT was also possible. The next years saw a rapid emancipation from the restricted and isolated world of the old mental hospitals. Sometimes new ideas, such as Maxwell Jones’ ‘therapeutic community’, were adopted.
The Ministry of Health saw a comprehensive service, integrating hospital and community resources, as a way of reducing overcrowding. The CMO’s report said:
The most successful form of rehabilitation has been a combination of habit-training and full occupation. A start is usually made in the ward containing the worst type of patient, the noisy, violent and destructive, and those with degraded habits. Such patients are split into groups of about ten, each group having one or more specially selected nurses in charge of it. The group is drilled into an unvarying routine with special emphasis on personal hygiene, cleanliness and neatness in dress, which need not imply any harshness since many of these patients appear to be quite indifferent to what goes on, and come after a time to respond mechanically. Full and suitable occupation is provided for the group under the supervision of its ‘permanent’ nurse. It is most desirable that patients being trained in this way should live in comfortably furnished rooms and that recreations should be provided. It is essential that they should have and retain their own personal clothing and underclothing. Few are the patients who fail to respond to such a regime. It is found that their wards become quiet and peaceful, the use of sedative drugs almost or entirely ceases, and locked wards can be opened.130
A committee in 1956 made recommendations about the rehabilitation of the mentally ill before discharge.131 However, continued support after discharge was not readily to be found. The Lancet said that aftercare probably had a more important place in the treatment of the mentally ill than in that of any other type of patient, but psychiatric social workers were scarce and aftercare had almost ceased to exist.132
The isolation of the mental hospitals and their staff from the public and the wider health professions was well known.133 Pioneering work on local services was undertaken in the northwest, where there were many small towns, such as Burnley, Blackburn and Oldham, of an independent turn of mind. Their town-centre municipal hospitals usually incorporated chronic and mental illness wards. From the outset, the Manchester RHB’s first chairman, Sir John Stopford, wanted to improve services, and by 1950 the RHB policy when appointing psychiatrists was to base them centrally to avoid the divorce of mental illness from the broad stream of general medicine. A planned and coherent system was developed, with outpatient assessment and early treatment in general hospital units. These were 100–200 beds in size, each with its own catchment area and admitting all patients. To everyone’s surprise they seldom needed to send patients to the few, large and distant asylums. Perhaps this was because the units were small and patients received individual attention; sited in the centre of the community, patients did not lose contact with friends and could go into town to local cinemas and football matches. Most psychiatrists felt that co-operation with an integrated geriatric unit was essential as the work overlapped, and that, with 100 beds, they could deal with a population of 250,000.134 Another major development was the requirement, set out in the Goodenough Report, that medical schools should have an active department of psychiatry.135 Increasingly, such departments were developed, took on catchment areas and, within general hospitals, often led the way in developing new treatments for the mentally ill.
Yet those visiting the old asylums, such as Members of Parliament, might be dismayed at what they found. The chairman of the mental hospitals’ committee of the Birmingham RHB agreed with the complaints of MPs. Beds were so close that they had to be moved to enable nurses to deal with troublesome patients. Ward temperatures might fall to 10°C in the winter. The weekly cost of care was £4 6s 7d, against £13 10s 10d in a general hospital and £22 9s 3d in a teaching hospital. “Give us an extra 5 shillings per patient”, said the chairman, “and we will achieve miracles”. Lack of staff meant that the patient/staff ratio nationally was 6.6 to 1 in mental illness hospitals and 7.0 to 1 in mental handicap hospitals. In Lancashire, some nurses banned overtime above the normal 48-hour week to call attention to the problem. As action spread, voluntary admissions had to be restricted.136
The Percy Commission
. . . that the law should be altered so that whenever possible suitable care may be provided for mentally disordered patients with no more restriction of liberty or legal formality than is applied to people who need care because of other types of illness, disability or social difficulty. (para. 7)137
Its membership included the President of the RCP and Dr TP Rees. Joint evidence from the Ministry of Health and the Board of Control provided a set of clear-cut proposals.138 The Commission reported in May 1957, recommending the repeal of all existing legislation and a single new law covering all forms of mental disorder.139 Running through the report were two simple ideas: first, that all distinctions – legal, administrative and social – between mental illness and physical illness should, as far as possible, be eliminated; and, second, that patients who did not need inpatient care should, wherever possible and desirable, receive treatment while remaining in the community. Compulsory powers of admission should be used less frequently. The assumption should be that mental patients, like others, were content to enter hospital unless they or their relatives positively objected. Its recommendations formed the basis of the Mental Health Act (1959) and were the foundation of a move towards community care.
The mental hospitals had shared the medical superintendent system of the municipal hospitals, and the superintendent was often autocratic. Even consultants might be ‘on parade’ in his office. Psychiatry, as a discipline, had problems. Its changing world was unfamiliar. Psychiatrists were few and, although there were leaders, the quality as a whole was questionable. The Ministry wanted to introduce state-enrolled assistant nurses, but the nursing unions and the Mental Health Standing Advisory Committee argued that mental health nursing should be undertaken only by fully trained staff or student nurses. In the event, most nursing fell to untrained personnel. Competition from other employers made poor recruitment worse; conditions in the army were better than in the mental hospitals. The wastage among students, who had a different pattern of training from those in the general hospitals, was high: 80 per cent compared with 40 per cent in general nursing.
General practice and primary health care
The NHS Act 1946 provided a family doctor to the entire population. The Bill emphasised that health centres were to be a main feature.140 At public cost, premises would be equipped and staffed for medical and dental services, health promotion, local health authority clinics, and sometimes for specialist outpatient sessions. The programme was aborted before it even started.
Whereas Bevan had persuaded consultants into the service in part by merit awards, the GPs had been unwilling to join until virtually the last moment. The public, however, were encouraged to sign on with those doctors willing to enter the scheme, leaving others with the choice of joining as well or losing their practices. Within a month, 90 per cent of the population had signed up with a GP. Twenty thousand GPs joined the scheme as they saw private practice disappear before their eyes.141 The NHS Act made it illegal to sell ‘goodwill’; instead a fund was established that compensated GPs when they retired, but it was not inflation-linked. The GPs’ contract for a 24-hour service, the nature of the complaints procedures and even the patients’ NHS cards were virtually unchanged (and still are). GPs, fearing that they might be no more than officials in a state service, argued successfully for a contract for services rather than a contract of service. As a result, they remained independent practitioners, self-employed and organising their own professional lives. The Spens reports determined pay, which was entirely by capitation.142 GPs’ income depended on the number of their patients; even their expenses were averaged and included in the payment-per-patient. Their independence thus assured, GPs were taxed as though they were self-employed, yet, unlike most people in small businesses, they could not set their fees. With a few exceptions, such as payment for a medical certificate for private purposes, no money could pass between patient and doctor. This system, combined with a shortage of doctors, provided no financial incentive to improve services, but neither was there any incentive to over-treat patients.
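In outline, and purely as an illustrative sketch (the actual capitation fee is not quoted in this account, so the fee used below is hypothetical), the payment rule was:

\[
\text{gross annual income} = f \times N
\]

where \(f\) is the flat capitation fee and \(N\) is the list size. At a notional fee of £1 per patient per year, a doctor with an industrial list of 4,000 would gross £4,000 and a neighbour with a list of 1,300 would gross £1,300, whatever work each actually did; expenses were averaged into \(f\) rather than reimbursed individually.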
Variations in list sizes between 1948 and 1989

                                1948                               1989
Locality             Population    GPs     Average list   Population    GPs     Average list
Harrogate            47,311        35      1,351          89,542        52      1,706
Wakefield            63,274        18      3,513          109,967       60      1,833
Leeds                421,193       173     2,438          482,936       262     1,840
Bradford             289,699       111     2,610          319,681       162     1,973
England and Wales    41,500,000    16,864  2,461          52,868,542    26,009  2,033

Source: John Ball143
Distribution of GPs by type of practice

Single-handed practitioners                 7,459
All practitioners in partnership            9,745
  in partnerships of 2 doctors              5,732
  in partnerships of 3 doctors              2,577
  in partnerships of 4 doctors                980
  in partnerships of 5 doctors                315
  in partnerships of 6 or more                161

Source: Cohen Committee (1954). Figures as at 1 July 1952.
The 18,000 GPs, almost entirely male and half of them single-handed, practised mainly from their own homes. Their distribution was uneven, although not so bad as that of the specialists because there was less dependence on private practice. Before the NHS began, a few GPs had made an excellent living, but many were poorly paid and some had to employ debt collectors. The NHS gave them security and a higher average income. Because they were paid by a flat capitation fee, those in the industrial areas who had large lists of 4,000 suddenly became affluent but had difficulty serving their patients properly. Proud and wealthy GPs in rural or rich suburban residential areas, with many private patients but small lists, became far worse off.144 The Medical Practices Committee established a system that defined areas as over-doctored, under-doctored or intermediate, and barred over-doctored areas to new entrants.145 Half the population lived in under-doctored areas, designated as in need of more GPs, where the average list size exceeded 2,500. Here it was possible for any doctor to set up practice, putting up a plate and waiting for patients to come. Eckstein wrote that places such as Harrogate were gorged with GPs while working-class areas nearby in cities such as Wakefield, Leeds and Bradford were comparatively starved.146 Swindon had average lists of 4,219 in 1948, while list sizes in Bournemouth averaged 1,334. By 1989, the figures were 2,079 and 1,831 respectively. John Ball, later Chairman of the Medical Practices Committee (and of the GMSC), said that there was always pressure for distribution to revert to under- and over-provision, and control was needed to ensure equality of access in the long term.147
The NHS brought fewer changes than the GPs had feared. Patients, uncertain of their rights, came with questions. Many older people, lacking spectacles for years, rushed to have their eyes tested, and for some months the service was over-stretched. Much untreated illness was brought to light, particularly in women who had suffered for years from chronic conditions such as prolapse. There appeared to be a rise in the workload. The consultation rates of women and children, who had previously been uninsured, were higher.148 No longer would people, counting the cost of the doctor and the medicine, say that they would be all right once the worst of an illness was over. Perhaps some work now coming to GPs was trivial; there was a belief, born of years of rationing, that ‘a line from the doctor’ would work wonders with the housing department. Paperwork changed; bills were no longer necessary, but there were forms for eye tests, sickness, milk and coal. Under the Lloyd George national health insurance scheme, GPs had received medical record envelopes in which they had to keep a note of consultations ‘in such a form as the Minister determined’. Wisely, ministers never defined what that form should be. Now the entire registered population had an NHS envelope, transferred from one GP to another when they moved. It came to contain not only the GP’s notes but also hospital letters, so potentially everyone now had a single medical record from birth to death.
A patients’ guide, produced by the Ministry in 1948, said that, as everyone could now have a GP, it was the GP who would:
arrange for the patient every kind of specialist care he is himself unable to give. Except in emergency, hospitals and specialists would not normally accept a patient for advice or treatment unless he has been sent by his family doctor.149
The referral system had previously been an ideal to which doctors aspired, but by which they were not bound if it went against their financial interest. Now the NHS established GP referral as almost invariable practice, imposing at least a partial barrier for patients seeking hospital care. The decision to go to hospital was transferred from patient to GP, reducing patient freedom and increasing the cost-effectiveness of the system. The ‘gatekeeper system’ institutionalised the separation of primary and secondary care. Family doctors defended it because they had continuing responsibility for individual patients; consultants because it protected them from cases that might be trivial or outside their field of interest; and government because a filter system saved money.150 Relationships between GP and specialist had been altered. Previously specialists had made their money from private practice, and many patients came on referral. Once the NHS was established, there was no shortage of NHS patients and few consultants made a substantial income from private practice. All were at least partly salaried, and most ceased to have any financial reason to be grateful to GPs.
One of the first quantitative accounts of the work of a GP was presented by a young doctor who had recently entered general practice in Beckenham – John Fry.151 He analysed attendances in 1951 by age and sex, noting the reasons for the consultation. Respiratory infections, digestive diseases, neuroses, skin disorders and cardiovascular problems headed the list. The GP dealt with minor ill-health and those major diseases that did not require admission to hospital. Three-quarters of his patients came to see him during the year. Philip Hopkins, in 1951–53, studied the impact of general practice on the hospital service. He presented data for a practice of roughly 1,500 patients with a consultation rate of 3.3 per year. In three years the practice had referred 860 patients on a total of 1,225 occasions. Of the referrals, 54 per cent were for treatment, often of a nature already clear to the GP. Because the local hospital denied GPs direct access to laboratories and X-rays, many patients were referred solely for a test. Often referrals were made to exclude serious illness before a label of psychoneurosis was attached. Only in 183 cases was the referral for a consultant’s opinion on diagnosis or further management.152 GPs were increasingly interested in practice organisation. Keith Hodgkin reported on the introduction of a radio-telephone into his practice. It enabled him to obtain an ambulance without delay, to continue his rounds while waiting for a delivery, and to get hold of a partner if an anaesthetic was required. The problems were cost and the inadvertent reception of his messages on TV sets, so Hodgkin had to watch what he said.153
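To put Hopkins’ referral figures in scale (a rough derivation from the numbers above, not a calculation he himself presents):

\[
1{,}500 \times 3.3 \times 3 \approx 14{,}850 \ \text{consultations in three years}, \qquad \frac{1{,}225}{14{,}850} \approx 8 \ \text{per cent},
\]

so roughly one consultation in twelve led to a referral.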
In 1948 there was little information about general practice; by 1952 more was available. There were 17,204 GPs in England and Wales providing unrestricted services, plus 1,689 permanent assistants and another 309 trainees. The number was increasing only slowly. A little over half were in partnership. In rural or semi-urban areas a third of GPs were single-handed, a third in partnerships of two, and a third in larger partnerships. The main surgery would be in a small town or other convenient focal point. In urban areas, most of the doctors were single-handed and there were few large groups. The largest lists were found in the industrial Midlands, the northeast coast, south Yorkshire and Lancashire. Even in partnerships, the GPs might see little of each other. The arrangement was largely financial, though it was easier to cover the doctors’ time off. More rarely, as in Skipton, effective group practices were developing in which the partners aimed to work together from the same premises, supporting each other, using a common medical record system and sharing supporting staff.
New entrants to general practice were supplicants; they would be expected to work long hours, reach equality of pay with their seniors in possibly seven years, accept the hierarchical system of the practice, generally behave themselves, and probably do most of the practice obstetrics.154 The Spens Report on GPs’ remuneration suggested that 10 per cent of GPs should be selected, because of their success in practice and suitability, to take on a trainee. The senior GP would be able to manage considerably more patients, make his services more widely available and increase his income. The scheme was later developed to provide vocational training, but that was not its original purpose.155 In 1950 a committee chaired by Sir Henry Cohen reported that the status and prestige of the GP should be equal to that of colleagues in any and every specialty, and that no higher ability, industry or zeal was required for the adequate pursuit of any of them. Cohen considered that, as general practice was a special branch of medicine requiring supervised training, there should be three years’ preparation: one in practice (any principal having the right to train a new entrant) and two in supervised hospital posts. GPs should continue their education and reading throughout their professional life.156 In 1957, the GMSC of the BMA circulated guidance to achieve greater uniformity in trainer selection and to eliminate abuses.
The cost of prescribing by GPs often substantially exceeded their own pay. The introduction of a free and comprehensive health service had coincided with the discovery and large-scale production of valuable but expensive drugs. Yet why should the number of prescriptions rise when more effective drugs should have returned people to health and work in a shorter time? In 1955 the prescription pricing offices began to send GPs analyses of their prescribing costs compared with the average for the area in which they practised, the beginning of a continuous attempt to constrain the growth in the cost of pharmaceuticals. In 1957, the Minister established a committee under Sir Henry Hinchcliffe to investigate the cost of prescriptions. Its interim report said that no evidence of widespread and irresponsible extravagance had been found.
Morale
The morale of GPs was low. GPs grumbled and there was little constructive discussion about how matters could be improved.
Something has gone wrong in general practice today. We treat the same people and similar complaints, and many of us have been doing the job for many a long year, and it is puzzling to say what has happened to bring about the change, for change there is. The doctor is irritable with the patients and they are noticing it and commenting on it. The patients are more aggravating and the doctor is noticing it. The GP had been promised more help, an easier life and no bad debts. He had got much more work, in some cases less income as private practice slumped, no bad debts, no help at all, and a lot of personal frustration; he had lost his soul when he lost the right to sell his practice, and felt that he no longer ran his practice – it was run for him. The patients had a hospital service which, save in an emergency, they could use only by appointment after a wait of several weeks, and a free GP service rushed to the point of indecency. His haemorrhoids had to bleed for six months before he could be treated; her heavy periods for nine months before she could get a hysterectomy. And having been in hospital, the patient could be home two weeks before the GP got a report.157
As consultant services improved, GPs were losing access to hospital beds and some felt that this made it difficult to improve standards and status.158 Teaching hospitals gave little priority either to undergraduate or postgraduate teaching for general practice. Both GPs and consultants saw the hospital as the fount of knowledge and GPs felt isolated. They felt embittered and frustrated, had lost their old enthusiasm and succumbed to the line of least resistance.159
Theodore Fox, Editor of The Lancet, published a leader saying:
Admittedly general practice in this country was deteriorating long before the NHS was introduced, and its further deterioration is due rather to a heavier load than to any legislative alterations in the Act. But on balance the effects of the Act on such practice have so far been for the worse and there is little evidence that its problems are being squarely faced. Of the two possible policies, the first is to say general practice is so often unsatisfactory that the correct course is to compensate for its defects – to develop hospital and specialist services in such a way that the short-comings of GPs become relatively unimportant. This, we cannot help thinking, is the policy that is, consciously or unconsciously, being followed. The alternative is to make a big positive effort to raise the level and prestige of general practice. This can still be done.160
The Collings Report
In 1944 the Nuffield Provincial Hospitals Trust’s ‘Domesday Books’ had examined the hospital service and found it wanting. In 1948 the Trust funded Dr Joseph Collings, who had experience in New Zealand and Canada as a GP and an interest in social medicine, to look at general practice.161 Nuffield records are silent on why he was selected, but he probably had introductions from Wilson Jameson (the CMO) and others on the social medicine network. Collings surveyed 55 English practices, all outside London. His report, published in The Lancet on 25 March 1950, raised an issue that was to dog general practice over the years – the wide and unacceptable variation in standards.
Collings spent between one and four days with each GP, seeing industrial (16), urban-residential (17) and rural practices (22).162 He was probably looking for the things he wanted to find. He went on to an academic post in the USA at the Harvard School of Public Health, and his critique was saleable journalism, just what the USA then wanted to hear. The Nuffield trustees invited Theodore Fox, of The Lancet, to edit the report and left the question of publication to the discretion of their chairman. The chairman of the trustees decided that it should be published by The Lancet and not the Trust. Fox was non-partisan, an instrument neither of government nor of the medical profession, but a detached critic of excess on either side. He did not want to promote Collings’ view of general practice, but he was fair and would not suppress it.
Collings had expected variations in quality but not how great they were. In city practices the conditions were so bad that he neither saw effective practice nor believed it was possible. He described surgeries without examination couches, where such records as there were lay loose round the room or in boxes, consulting rooms with a chair for the doctor but not for the patient, and couches where boxes and bottles had rested so long that they had stuck to the surface. Symptoms clearly demanding examination or referral were often passed over. Snap diagnosis and outdated medical knowledge were commonplace. Anything approaching a general or complete examination was out of the question under the prevailing conditions. In rural practice the surgeries were more pleasant, although often lacking basic equipment. The country doctor not only spent more time with his patients but also knew them better. Many GPs were good clinicians, good technicians and fine humanists; certainly not all. Urban-residential practice fell between the two; conditions for the patients were better than the industrial surgeries for “the patient with more cultivated taste expects attention to the niceties”. Taken as a whole, the detailed 30-page report was a damning indictment. Collings wrote that there were no objective standards for practice and no recognised criteria by which standards might be established. ‘We can all make mistakes’ was certainly true in general practice, but the individual mistake paled into insignificance beside the predisposing factors which made serious mistakes not only possible, but in some circumstances highly probable. The reputation of general practice, Collings said, had been maintained through identification with an ideal picture that would no longer stand up to examination. General practice was poorest in proximity to large hospital centres and improved in scope and quality as one moved away. The worst practice was found where the need was greatest, in areas of dense population. Some premises required condemnation in the public interest. Yet Collings remained an enthusiast for general practice. Instead of building up hospital services, he felt the aim should be to see how they could be dispensed with. That meant teamworking of doctors, nurses, social workers and technicians in good premises, which might be based on group practice units perhaps serving 15,000– 25,000 people. The widening schism between hospital and practice, the lack of local authority interest, and the failure of administrative co-ordination, in his view, did nothing to help.
The BMJ, provided with a pre-publication draft, disputed Collings’ findings. The journal rightly thought that his 55 practices did not truly reflect the whole of general practice, and certainly Collings had been selective. The BMJ thought that the report would at least do one good thing – focus the spotlight on general practice, which should be the most attractive career in medicine. The NHS was weighted heavily in favour of the hospital and the specialist. Most of the letters to the BMJ disputed Collings’ findings or excused the shortcomings. A minority saw that the report might be a turning point in the NHS and that it was up to GPs to take a lead in establishing an integrated service based on general practice.163 Collings entered the demonology of general practice, but stirred others into activity. His three further articles, in 1953, were largely ignored.164 In them he argued for group practice rather than health centres. Group practice was evolutionary and was the only way to breathe life back into the finest, dying, elements of traditional general practice. Collings laid out a detailed and costed plan, at both practice and national level. He discussed the staffing, the architectural design of premises and the management and personality issues that arose in groups. He considered the financial inducements required and the financial advantage to government: the better general practice became, the less work fell on hospitals, where care was expensive.
The British Medical Association (BMA) survey
Charles Hill, Secretary of the BMA, advised the Council that Collings, having been published, had to be ‘answered’. He suggested that a general practice review committee should be set up to obtain an authoritative and statistical report on general practice. Stephen Hadfield, an assistant secretary at the BMA, was given the job, mainly because he was the member of staff most recently in general practice. Throughout the next year, Hadfield visited four or five practices chosen at random each week.165 His report was fuller, more balanced and statistically based. Analysing his findings, he made judgements of quality of care: 92 per cent of GPs were adequate or something better; 69 per cent left no doubt that patients received what examination was necessary; three out of four paid reasonable attention to record keeping; and 7 per cent of both young and old GPs needed to revise the methods of diagnosis they used. Hadfield was surprised by how often the abdomen was examined with the patient standing and clothed. In 10 per cent of surgeries, the accommodation was dismal, bare, inhospitable and dirty. Some GPs were clearly discouraged when they saw the lines round the walls where greasy heads had rested or the marks of nailed boots on the floor. Relations with the hospitals were good and probably better than before the NHS, when voluntary hospitals kept outpatients to maintain high attendance records. With public health medicine, the position was worse. GPs saw district nurses as the salt of the earth, but reported little co-operation from health visitors and complained bitterly about them as a waste of nursing manpower. Hadfield believed that GPs, hospital consultants and public health doctors had to get to know each other better. They were treading different paths while the NHS was crying out for unified administration. General practice could follow one of two paths: either adjust to the situation and stimulate new clinical interests, or move towards an impersonal health service, taking general practice into a glorified hospital outpatient department.
There was a delay of a year before Hadfield’s report was published in September 1953. It was passed round the BMA committees because it contained comments about all branches of medicine. The chairman of the review committee wanted to publish, so that the profession might see the evidence and the public would know that the BMA was making a serious effort to raise the status of the GP and the standard of practice. If nothing else, it would make people think and start things moving. Others in the BMA thought that the report should be edited before publication, or should remain private, as it showed that not all GPs were quite ‘angelic’. The press would make capital out of shortcomings and some GPs would be angry. Yet published it was. Every profession, said the BMJ, has its quota of unsatisfactory practitioners; that a few should be outstandingly bad was only to be expected. The remedy was in better selection of students. Unsatisfactory relations with other parts of the service also impeded the work of the GP and the tripartite structure was a root cause of this. Finally the stresses created by the rapid advance in medical science over the previous three decades were responsible for some difficulties.166
Good general practice
The Nuffield Provincial Hospitals Trust, inadvertently responsible for stirring up the hornet’s nest, tried to remedy the situation. In 1951, Dr Stephen Taylor, doctor, medical journalist, Labour MP and a figure in the political background of the NHS at its inception, lost his seat in the election and was commissioned to examine the acceptable face of general practice. He was as selective as Collings but visited the best, some of whom had been recommended by Hadfield. They were the ‘doctors’ doctors’ with lessons to teach. He worked under the supervision of a steering committee of the great and the good, chaired by Sir Wilson Jameson, to avoid another cause célèbre. Taylor’s report, Good general practice, described its structure and organisation.167 Doctors who organised their practices were less stressed, more effective and happier. Whatever the perfection of the NHS administrative framework, Taylor concluded, “in the final analysis, the quality of the service depends on the men and women who are actually doing the job . . . good general practice begins with the good GP. So most of the conclusions are suggestions for self help”. The BMJ commended the book to all young practitioners.168 Taylor retained his interest in general practice, was involved in the establishment of a teaching practice at St Thomas’ Hospital, and was the moving spirit behind one of the earlier health centres, opened in 1951 in Harlow New Town.
The Cohen Committee
After Collings, the CHSC established a wide-ranging review in December 1950. Chaired by Sir Henry Cohen, Professor of Medicine in Liverpool, the committee included leading figures in the hospital and local authority worlds, several well-known GPs and Stephen Taylor. The conclusions of Taylor’s book, Good general practice, were submitted to the Cohen Committee. The Medical Practitioners’ Union (MPU), a national organisation of GPs dating from 1914 with Labour Party links, believed it had some answers. It suggested the development of group practice, revision of the payment system so that GPs were encouraged to spend money on improving their practice, the attachment of nurses and home-helps to group practices, and a salaried service for GPs.169 The Cohen Committee reported in 1954 and endorsed Stephen Taylor’s findings, but it was not the brightest of bodies and it produced no new thinking.170 Its value lay in its authoritative nature, seeing general practice as fundamental to health services. Practice could not be replaced by “congeries of specialisms, nor was it subordinate to them”. Cohen commended group practice, as it encouraged co-operation, and thought it might develop into the natural focus of the “various domiciliary arms of the health service”, securing the advantages of better staffing, accommodation and equipment more easily than health centres. Students should be given the opportunity to study the scope of general practice. More radical ideas were discouraged – long service or merit awards, assisting retirement of elderly GPs, or undergraduate teaching by GP academics.
One problem that GPs faced was the 24-hour commitment. Their contract was to provide a round-the-clock service. As independent practitioners, they had to find a substitute to cover holidays and leisure time. The first deputising service made its appearance in 1956 as a private venture of two South African doctors. Against the initial opposition of the BMA, and with no support from government, Solomons and Bane launched an emergency call service, providing duty doctors in cars with two-way radio contact to a central base. GPs, at least in London, now had a new way of covering their practices to give themselves time off duty.171
Improving general practice
Three factors helped the restructuring of general practice. First there was a change in the way family doctors were paid, which provided a financial incentive to improvement in ways both the profession and government desired. Second, innovative GPs began to paint a vision of practice as it might be, and sell the vision successfully to their colleagues. Articles began to appear, describing better systems of practice organisation, record keeping, appointment systems and the work of nurses.172 Third, professional organisations began to work behind the scenes to improve facilities, such as GP access to diagnostic services. The BMA was already involved. A quiet partnership between government, the BMA and the Royal College of General Practitioners (RCGP) moulded the most important ideas into a new policy. Donald Irvine, an Ashington GP (later Chairman of Council of the RCGP), listed its six elements:173
- Encourage groups
- Rehouse GPs in properly equipped, purpose-built premises
- Help individual GPs develop a viable organisation
- Give GPs access to hospital-based diagnostic services
- Introduce nurses and other health professionals to form primary health care teams
- Provide better postgraduate education.
Money, status and recruitment go hand in hand. GP pay was based on the recommendation of the Spens Committee, appointed in 1945 and reporting the following year.174 The starting point was a workload survey conducted between July 1938 and June 1939.175 Austin Bradford Hill, the statistician, said that, of the 6,000 doctors selected, fewer than 1 per cent refused to co-operate. Those who refused were too busy, or had unprintable views about the BMA, the Ministry of Health, statisticians or all three.176 According to the way the returns were interpreted, the consultation rate was somewhere between 4.81 and 5.39 per patient per year. The baseline for earnings was the average pre-war income as declared to HM Inspector of Taxes. As GPs might not always declare their full earnings, this was an underestimate. Spens believed that the GPs’ average income was too low, in the light of the length of training, the arduousness of the life compared with other professions, the greater danger to health, and the skill and other qualities required. Spens thought that, before the war, many doctors had been deterred from becoming specialists by the certainty of many lean years. The NHS would remove this deterrent, and if GPs were not well paid, recruitment would suffer and only the less-able young doctors would enter this branch of medicine, to the detriment of the profession and the public. Spens recommended a level above the pre-war average, and wished to see a system enabling good and energetic doctors to achieve substantial earnings. It left the adjustment to post-war values to others. GPs therefore entered the service paid on a provisional basis with the promise of a review. They rapidly, and reasonably, became dissatisfied with their earnings and with a grossly inadequate betterment factor intended to bring GP pay up to 1948 levels.177 The review that had been promised did not materialise, and two years after the NHS began the Local Medical Committee Conference instructed the GMSC to make preparations for the ending of contracts.178 GPs had seen the Minister cut the remuneration of dentists and felt at his mercy. The dispute continued until 1951, when it was agreed to go to arbitration.
The Danckwerts award
The report by Mr Justice Danckwerts in March 1952 was a turning point. Taking account of inflation since 1939 and increases in the incomes of other professions, he recommended that the central pool divided among the country’s GPs should be increased to £51 million, a rise of roughly 25 per cent. The government had never expected an award of this size but was unable to avoid paying. The figures were related to the number of GPs rather than the size of the population, so if recruitment improved and list sizes fell, the average GP’s pay would not be affected even though the workload might fall (a rough sketch of this arithmetic follows the list below). Danckwerts said that “if the number of doctors in the service became unreasonably large this point would require reconsideration”.179 It was clear to the Ministry that the size of the award made it possible to improve general practice. The government accepted it, subject to agreement on a system of distribution that would provide incentives without obviously penalising the ‘back-woodsmen’. Within three months there was agreement on:
- changing the flat capitation rate to give a higher return to doctors with intermediate-sized lists (500–1,500), so that new partners would be taken on more readily
- an initial practice allowance to make it easier for new doctors to enter practice
- financial encouragement to form partnerships and group practices.
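The pool arithmetic referred to above is easiest to see with rounded figures (an illustrative division only: the pool also had to cover practice expenses, so the result is not take-home pay). With some 18,000 GPs sharing a pool of £51 million,

\[
\frac{£51{,}000{,}000}{18{,}000} \approx £2{,}830 \ \text{per GP per year (gross)},
\]

and because the pool was tied to the number of doctors rather than to the size of the population, extra recruits enlarged the pool proportionately: the average payment stayed the same while average lists, and with them the workload, fell.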
The maximum number of patients a single-handed doctor could have was reduced from 4,000 to 3,500, which also became the maximum average for a partnership.180 The profession was broadly satisfied with the outcome and the award rapidly had the desired effect. GPs received a considerable sum in back-pay; some spent it on modernising their premises. The following year there was a net increase of 806 doctors and 1,118 new doctors joined partnerships. Long-standing assistants often became partners. The number who were single-handed fell by 312.181 It was an early demonstration of the effect of financial incentives on general practice. The profession agreed that £100,000 each year should be top-sliced to provide interest-free loans to group practices wishing to provide new or substantially better premises. This loan scheme was so popular that some applications could not be approved. In 1954, 36 applications were accepted, totalling £159,000.
Continuing disputes about doctors’ pay, and a threat by GPs to withdraw from the NHS, led in March 1957 to the establishment of a Royal Commission on Doctors’ Pay. Following the Royal Commission’s recommendations, the loan scheme was funded directly by the Exchequer and not from top-sliced money. Because it was impossible to identify precisely to whom money should be reimbursed, it was agreed to hold it in trust as a medical charity – the Cameron Fund.
Appointments systems, tried experimentally in a few places, had been shown to reduce the number of visits requested. A more even distribution of doctors was emerging as a result of the work of the Medical Practices Committee. There was a steady decrease in the number of patients living in under-doctored areas, from 21 million in 1952 to 9 million in 1956. Although the arrangements went some way to encourage group practice, it remained difficult for a small practice to find the funds to pay an additional doctor. There were comparatively few vacancies and two-fifths of them attracted over 40 applicants each. The easiest place to enter practice was the north of England, where list sizes were largest.182 Health centre development, which might have provided new posts, was minimal. The concept was unpopular with GPs, rents were high, and it took a long time to design and build health centres, partly because of the need for many parties to agree.
The College of General Practitioners
Two memoranda proposing a college of general practitioners were presented at a meeting of the BMA General Practice Review Committee in October 1951. Stephen Hadfield, the Secretary, knew that Fraser Rose of Preston was interested in founding a college. At the same time, he discovered that a friend of his in private general practice, John Hunt, had a similar desire. John Hunt was invited to a meeting of the Committee and introduced to Fraser Rose. The two wrote a letter to the BMJ and The Lancet, published on 27 October 1951, proposing a college. It was like a breath of fresh air to many GPs.183 The idea was discussed for about a year and the strong opposition of the RCP, RCS and RCOG was clear, as was often the case subsequently when new colleges were in prospect. They would have supported a joint faculty of general practice within their own structures, but not a separate institution. Additions to their numbers risked weakening their influence; with few colleges, people listened when a leader such as Lord Moran spoke. In November 1952 the College of General Practitioners was formed in secret, when the memorandum and articles of association were signed by the 16 members of the steering committee. The creation of a college, according to George Godber, provided “the banner with a strange device” that people could follow. The College ethos was, from the start, to lead from the front. It encouraged high standards of service, teaching and research, attracting theorists, for theorists cannot usually work alone. After six months there were 2,000 members.184 Within four years it had developed 22 regional faculties. Although membership increased steadily, only a minority joined; in 1957 the membership was a little over 4,000. College influence was largely restricted to its membership and no responsibility was taken for the weaker brethren. Unlike the older colleges, membership played little part in professional advancement. The GMSC had wider responsibilities and was in a position to influence all GPs, as it did in 1954 when local medical committees were asked to inspect practice premises.185
The crux of the College vision was that family medicine had its own skills and knowledge base that were as important as anything the hospital services might bestow upon it. The work of men such as Keith Hodgkin, a GP, and Michael Balint, a psychoanalyst, was central to this. Balint, at case conferences at the Tavistock Clinic, cast new light on the nature of the consultation and was an important figure in the establishment of general practice as a discipline in its own right.186 He argued for a different type of education and research, and pointed to the relationship of the GP and the consultant as a perpetuation of the pupil-teacher relationship.187 One of the College’s first initiatives was to see what, if anything, medical students were taught about general practice. A survey published in 1953 showed that, although medical students from a number of schools visited GPs, and many schools were ‘planning’ some opportunity for the teaching of students by GPs, only Manchester and Edinburgh had such a teaching unit in the medical school.188 It was the beginning of a struggle to attain recognition of general practice as a subject entitled to a place in the overcrowded student curriculum.189
The College epidemic observation unit in Surrey began to plot infectious disease in the community. The Birmingham research unit, led by Crombie and Pinsent, was interested in mathematical modelling of general practice and took the lead in national morbidity surveys. Crombie, in a remarkable research project, ran surveys under the auspices of the College and the General Register Office. Between May 1955 and April 1956, careful records of a year’s consultations were kept by 106 practices, involving 400,000 patients and 1.5 million contacts.190 These practices provided a clear description of their clinical work. The study showed who was consulting GPs, for what, and what was being referred to hospital. Consultation rates for cancer, neurosis, circulatory and respiratory disease, and arthritis and rheumatism were provided for the first time and the surveys improved knowledge of the incidence and prevalence of most forms of disease. The CMO at the Ministry, Sir John Charles, thought it an important source of data that should affect decisions on medical student training.191
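A simple check on the scale of the survey (a derivation from its own totals, not a figure the report states):

\[
\frac{1{,}500{,}000 \ \text{contacts}}{400{,}000 \ \text{patients}} = 3.75 \ \text{contacts per patient per year},
\]

broadly consistent with the consultation rate of about 3.3 that Hopkins had recorded in his own practice.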
Towards a vision for general practice
Iain Macleod, the Minister, addressed the Executive Councils’ Association in October 1952 about the future.192 The BMJ thought it a refreshing and forthright speech in line with BMA policy. Macleod stressed the desirability of treating patients in the community and sending them to hospital only when medical or social conditions made it essential. This would increase the interest of general practice, be of benefit to patients, cut waiting lists and save money. Reduction of list sizes and the development of group practice would help. Co-operation between hospitals and GPs needed improvement, for example by expanding the direct access to X-rays and pathology departments that GPs were increasingly using. Without encroaching on the responsibilities of the local health authorities, Macleod thought that the GP should be the clinical leader of a team within which the midwife, the district nurse and the health visitor should all work. The GP should also work more closely with dentists, pharmacists and opticians. There should be the same spirit of teamwork devoted to the patient in general practice as in hospital.193 A renaissance of general practice began, on a new model laid out by the profession and the Ministry.194
The Danckwerts award opened the path ahead but it did not solve all problems. Variation of practice standards remained a consequence of independent practitioner status, for while the energetic could improve their practices substantially and rapidly, not all GPs did, and their patients suffered. Enoch Powell wrote in 1966 that, whether the practitioner was good, bad (up to the point of incurring a disciplinary stoppage) or indifferent, he got the same payment for the same list. Inside general practice he could increase his earnings only by increasing the size of his list. The doctor was not primarily dependent on ability or reputation to increase his list, and in such competition as there might be, the doctor’s willingness to prescribe a placebo or the drug recommended by the patient, or to complete the desired certificate, might be as effective as skilled and conscientious care. The GP’s situation combined private enterprise and state service without the characteristic advantages of either. He could not reap the rewards of building up a practice, and the better he did his work, the worse off he was. Money spent on premises, equipment and staff did not increase his income, for the cost came from an income that would be undiminished if he did nothing. If he restricted his list to the number that could be treated properly, he merely ended up with a smaller income than less-able or less-scrupulous fellows. Powell believed that the essence of the private enterprise system, competition for gain, had been gouged out of family doctoring, leaving the empty shell.195
Local authority health services
The 1946 Act required local authorities to consult hospital authorities and the executive councils about their health service plans.196 The transfer of the general, long-stay, tuberculosis, infectious disease, mental illness and mental handicap hospitals to the RHBs substantially reduced their role in the direct provision of care, as did the proposed integration of preventive clinical services with general practice. Environmental sanitation was passing to engineering specialists, sanitary inspectors were becoming more expert and independent, and infectious disease seemed to be diminishing and to require collaboration with the PHLS, national and even international authorities. The role of the MOH changed from the development of services to helping services provided by others, co-ordinating them and reviewing their effectiveness. Those believing that public health should be managerial and deliver services saw the passing of a golden age.
However, local health authorities retained broad and important health functions, and gained a few additions, enabling the MOH to maintain a role as guardian of the community’s health. Many, for example George Townsend, the MOH for Buckinghamshire, accepted that there had been gains as well as losses, and quietly took the opportunities offered. Several components of health care had been put together for the first time and there was work to be done. Some services had been in difficulties: the voluntary nursing associations were inadequate and failing, and health visiting required reorientation. The NHS Act contained a ‘care and aftercare’ provision that enabled local health authorities to develop facilities for the mentally ill and handicapped. Immunisation needed reorganisation, and the programmes had to involve GPs and be capable of prompt expansion. Maternal and child welfare and health visiting were already established; home midwifery had been under partial local authority control; and ambulance services were derived in part from wartime services.
From 1948 local authorities had full responsibilities for nursing in the community and the development of preventive and social support services, for example, the home help services. Some large authorities had appointed superintendent nursing officers before the NHS began and all now began to do so, developing leaders of the public health nursing team just as matrons in hospitals were looked on as leaders of hospital nursing teams. At first many used voluntary nursing organisations, such as the Queen’s Institute, as their agent. Rapidly, however, local authorities brought the nursing services in-house. Everyone now had access to care, and hospitals discharged patients increasingly rapidly, which meant that more acutely sick patients had to be cared for at home, altering the work of the district nurses substantially and revealing shortages of staff. Health visitors had once dealt with a host of minor problems. Now that everyone had a GP, these were taken to the family doctor. GPs were taking an increasing interest in mothers and babies, and it was possible that health visitors might be squeezed out of a viable role.197 In 1953, a working party was established, chaired by the then recently retired CMO Sir Wilson Jameson, to advise on the work, recruitment and training of health visitors. The health visitor’s role was defined as primarily health education and social advice. She should become a general family visitor, making a contribution in fields such as mental health, hospital aftercare and the care of the aged. The Jameson working party saw a need for co-operation with GPs, but dismissed the idea of attaching health visitors to particular practices, thinking that health visitors would work on an area basis.198
In 1954, MacDougall, MOH for Hampshire, provided health visitor support for groups of GPs in Winchester by attachment; a little later Warin developed a similar scheme in Oxford, as did Chalke in Camberwell, an inner-city area. Community nurses were coming into contact with a wider range of professionals and were now full professional partners and members of the general practice team.199
Health centres, first proposed in the Dawson report of 1920, were a local health authority responsibility.200 Although they were part of the dream of the founders of the NHS, there was no practical experience of their pros and cons. Six months before the start of the NHS, the Ministry stated that, because of building difficulties and uncertainty about the best pattern to adopt, no general development of health centres was appropriate. This, and GPs’ suspicion of a state service, an idea hopelessly entangled with health centres, slowed development to a virtual standstill. Two opened in 1952: a large and costly one (planned before the NHS began) by the London County Council (LCC) to serve a new housing estate at Woodberry Down201 and a smaller one, the William Budd health centre in Bristol.202 In the first 15 years of the health service, only 17 were opened. The health centres provided doctors, patients and ancillary staff with many advantages, and few disadvantages were apparent. Many GPs, however, used the centres only as branch surgeries.
Health promotion and disease prevention made a measure of progress in the first decade. One pioneer was John Burn, the Salford MOH, who established the first anti-smoking clinic. He helped the development of mental health services, and the use of nursing staff in immunisation and screening clinics. After the London smog of 1952, he was a member of the committee that engendered the Clean Air Act 1956, a massive advance in creating a healthy environment.203 But centrally there was a failure to grasp the nettles of growing alcohol consumption, fluoridation and, most of all, smoking-related disease.
Hospital and specialist services
On the appointed day in England and Wales, the NHS took over 1,143 voluntary hospitals with some 90,000 beds, and 1,545 municipal hospitals with about 390,000 beds (including 190,000 in mental illness and mental handicap hospitals). Experienced and influential SAMOs, who in their local authority days had experience of hospital management, headed most RHBs. They understood the need to develop good specialist services accessible to the entire population. The demand for hospital care was rising. New surgical procedures for common conditions such as varicose veins increased the demand for beds, making it important to discharge people more rapidly. There was great pressure on both acute and long-stay beds, and continuous attempts to increase turnover and occupancy. As a result of the appointment of young, well-trained consultants, the quality of provincial district general hospitals (DGHs) improved. Such was Kenneth McKeown, from Hammersmith and King’s, who was appointed to Darlington in 1950 as its first consultant surgeon.204 No longer did major surgical cases have to go to Newcastle, Leeds or London, and McKeown established the hospital as a centre for oesophageal surgery. For the first time, major developments emerged from district hospital specialists. They included Norman Tanner, who worked on the surgery of peptic ulcers at St James’ Balham; Harold Burge, who explored the results of vagotomy at the West Middlesex Hospital; and John Paulley at Ipswich, who showed the mucosal abnormalities in coeliac disease. Supporting them were better investigation and diagnostic services, with good pathology and radiology departments. Intervention was prompter, and improved anaesthesia, no longer a part-time activity for some GPs, meant safer operations for older people. The very success of the NHS created a problem: even patients with emergency conditions such as retention of urine, or with curable diseases, might be difficult to admit. The BMJ drew attention to the shortfalls in the service; the dangers of going to bed, described by Asher, could be contrasted with the dangers of not going to bed.205
St George’s female medical ward, June 1951

Diagnosis                         Age   Treatment
Tuberculous meningitis            38    Streptomycin/morphine
Haematemesis                      67    Ascorbic acid, aludrox, thyroid, gastric diet
Carcinomatosis                    80    Nepenthe, pethidine
Right hemiplegia                  77    Ammonium chloride
Subacute rheumatism               27    Aspirin
Pernicious anaemia                71    –
Fractured femur                   70    –
Mitral stenosis                   33    For valvulotomy
Investigation of headaches        64    Codeine
Costophrenic pleurisy             40    –
Laparotomy                        65    Nepenthe, gastric diet
Coronary infarction               55    –
Ulcerative colitis                44    Low residue diet, chiniofon infusion
Acute rheumatism                  24    Salicylates, benadryl
Thyrotoxicosis                    24    Bed rest, methyl thiouracil, phenobarbitone
Polyarteritis                     30    Aspirin
Coronary infarction               66    Tromexan, complete rest
Investigation of right kidney     60    –
Sonne dysentery                   74    Thalistatin, barrier nursing
Macrocytic anaemia                71    Digitalis folia
Tubercular peritonitis            –     Streptomycin, PAS
Congestive heart failure          56    Digitalis folia, cardophyllin
Duodenal ulcer                    34    Pethidine, gastric diet
Subacute bacterial endocarditis   22    Morphia, penicillin/streptomycin
Investigation of lung             49    Pethidine
Almost all had daily blanket baths and night sedation.
The Portsmouth hospitals took the bold step of issuing a patient questionnaire. Half were returned and two-thirds of those were wholly laudatory. There were, however, suggestions. Perhaps the food might be warmer, and lavatories more available. The hair mattresses were lumpy and the wireless service could be better. Lack of privacy, of chairs for visitors and of adequate visiting times featured among the criticisms. Could not mothers be allowed to handle their newborn babies more often before discharge?206
Hospital development
With limited materials and a strained economy, the government’s post-war priorities were housing and education. However, as money and materials permitted, thoughts turned to hospital building. Hospital surveys, such as the one for Sheffield with which George Godber was associated, had outlined a development policy. Sites should be large enough to allow for expansion, and the first new buildings on a site must be placed in a way that did not prevent this. Plans should be examined and approved by a central authority, informed by clinicians, matrons and administrators experienced in hospital work.207 In 1949, the Nuffield Provincial Hospitals Trust, with the co-operation of the University of Bristol, sponsored an investigation into the design of acute hospitals and established a team, led by Richard Llewelyn Davies, that included architects, statisticians, doctors and nurses. Its report, published in 1955 as Studies in the functions and design of hospitals, laid the foundation of future hospital design in the UK.208 An attempt was made to combine experience and new thinking, and to take advantage of good practice and new designs worldwide. The study examined the requirement for hospital accommodation, using information from surveys in the Northampton and Norwich hospital groups to estimate the demand from the surrounding area. It looked at the physical environment, heating, lighting, ventilation, the control of noise and fire precautions and it also covered the detailed design of individual departments. Throughout the study, architectural proposals were put firmly in the context of clinical policies and how staff worked.
Little new hospital construction was possible until 1955. Even then there was not enough money for whole new hospitals, only for individual departments (for example, outpatients) and the replacement of antiquated plant in laundries and boiler rooms. The Ministry issued a bulletin on the most urgent problem, operating theatre suites, of which 700 were built in the first decade. Other building guidance followed. Teaching hospitals were now a national responsibility and perhaps a disproportionate amount of money was spent on them, particularly in London. It was necessary to decide how costs should be divided between the NHS and the universities. Most of the cost inevitably fell on the board of governors, but the areas used for teaching (e.g. seminar rooms) were a university responsibility. As for research, the NHS provided facilities for research on patients being investigated or treated, but other facilities such as animal-testing houses and research laboratories were a matter for the university.
Hospital management
While the teaching hospitals had retained their boards of governors and their traditional organisation, other hospitals had been grouped functionally under HMCs. The smaller voluntary hospitals, and municipal hospitals whose system of management owed little to the voluntary tradition, now had to work together. For example, Salford Royal Hospital, small but proud of its past, was now coupled with Hope, the municipal hospital, three times its size, part Victorian buildings and part pre-war modernisation, and an excellent hospital in its own right. In the voluntary hospitals, it had been traditional for there to be a partnership between the governing body, the house governor, the matron and the chairman of the medical committee representing the visiting staff. The municipal hospitals, however, had enjoyed little local autonomy. The medical superintendent was in charge, the matron and lay staff reported to him, and he to the MOH. The two types of hospital had to adjust to the new situation.
The Bradbeer Committee
The Bradbeer Committee was appointed to examine the situation for the CHSC in 1950.209 Bradbeer reported in 1954 that each hospital was a corporate body with a morale of its own that made for efficiency. The report commended the locally based partnership of medicine, nursing and administration that had characterised the voluntaries. Each hospital should have a medical staff committee with a consultant working part-time on administrative matters. At HMC or ‘group’ level, there should be a single administrative officer to whom the governing body could look for the co-ordination of all activities; this officer would not be a doctor, and there should be a move away from medical superintendent posts. Most business should be submitted through this chief executive officer to the management committee. After Bradbeer, the group secretary became more powerful and more distant from the clinicians and the matrons.
Hospital information systems
Changes in hospital staffing and activity
| 1949 | 1950 | 1951 | 1952 | 1953
Inpatient cases | 2.9 million | 3.1 million | 3.3 million | 3.4 million | 3.5 million
Outpatients | 6.1 million | 6.2 million | 6.3 million | 6.4 million | 6.7 million
Medical and dental staff (a) | 8,954 | 9,650 | 10,237 | 10,581 | 10,741
Nurses and midwives (a) | 125,752 | 132,408 | 136,210 | 140,964 | 144,558
Waiting lists (b) | 492,000 | 524,000 | 496,000 | 490,000 | 514,000
Bed turnover (c) | 9.5/year | 10.1/year | 10.7/year | 11.2/year | 11.6/year
(a) Whole-time; part-time excluded.
(b) Includes mental illness and mental handicap.
(c) All specialties except mental illness and mental deficiency.
Source: On the state of the public health; annual reports of the CMO
Information about the hospitals’ clinical services was hard to find and would clearly be needed. From 1949 an annual return was required of all hospitals, showing the number of staffed beds, the number in use, their daily occupancy, the number of patients treated, and the waiting list for admissions on the last day of each year. However, this return was not available until it was months out of date and was not a tool for effective management. Shortly before the NHS began, the Ministry’s CMO, Wilson Jameson, asked George Godber to look at the problem and a team was assembled, chaired by Sir Ernest Rock Carling, which included Austin Bradford Hill, Alan Moncrieff, Francis Avery Jones and Percy Stocks (a statistician from the General Register Office). A front-sheet was designed, simple enough for even the least organised hospital. It recorded key information: name, diagnosis and length of stay. In 1949 the Ministry invited volunteer hospitals to use this sheet and supply a 10 per cent sample of patient-based data for analysis, the Hospital Inpatient Enquiry. A step in the right direction, the enquiry relied on medical record officers choosing a random sample of case notes – and not, for their convenience, the shortest ones. The scheme became compulsory in 1957 and was run centrally by the General Register Office.210

Each year the number of beds available rose slightly and the number of cases treated increased by about 100,000, largely from more effective use of beds and shorter lengths of stay. There was little impact on waiting lists, which remained stuck at around half a million and were worst in general surgery, gynaecology and ENT. For tuberculosis, new methods of treatment, shorter lengths of stay, and the use of isolation beds to clear the backlog of patients all but eliminated waiting lists and made resources available for other types of work. Better use was made of existing facilities, but the effect of better planning was absorbed in previously unmet needs. Obstetricians were arguing for hospital delivery and mothers were responding. The performance of hospitals differed. Non-teaching hospitals generally had shorter lengths of stay than teaching hospitals. London teaching hospitals, on average, kept patients longer than those in the provinces and, in extreme cases, there were threefold differences.
London
During the blitz, the teaching hospitals had been forced to leave London. Some in the Ministry privately hoped that not all would return, for post-war housing policy was to rebuild homes on the periphery, often in the new towns, and to move the population outwards. All, however, returned. The Goodenough Report (in 1944) and the Hospital Survey for London (in 1945) had argued that three teaching hospitals should move from central London: St George’s, Charing Cross, and the Royal Free. Bevan wanted the war-damaged St Thomas’ to move to Tooting, but he was persuaded to change his mind. In 1949 George Godber took him to St George’s to persuade that hospital to move to Tooting, where general hospital facilities were needed. To help the selection of building schemes and discussions with London University, a new survey of the hospitals was launched in 1955. Four Ministry officials visited all London’s hospitals to see what changes might be needed because of the substantial movement of population outwards that was now taking place.211 They found that hospital development in the new areas had been slow and irregular, and that some central hospitals such as St Mary’s still served large local populations, but others such as the Middlesex and St Bartholomew’s had falling local catchments. Lacking local facilities, the growing peripheral populations were increasingly dependent on central hospitals, so it became policy to develop a ring of DGHs in outer London. Teaching hospitals were at greater risk of losing their patients. Yet the University of London believed that the London medical schools, and therefore their matching hospitals, should be as close to the university precinct as possible, and opposed plans for relocation. Charing Cross, which had hoped to move to the new hospital being built at Northwick Park, had to remain more central, and the Northwick Park site became available to the MRC.212
Hospital farms
An unusual activity for a health service, left over from pre-war days, was hospital farming. It had developed mainly in conjunction with mental illness and mental handicap hospitals. The Ministry found that 190 hospitals in England and Wales were farming 40,000 acres without saying much about it. There were 3,800 acres of market garden and 4,000 acres of woodland. There were 7,000 cows and heifers, 25,000 pigs, 5,000 sheep and about 63,600 poultry.213 Farming as a whole was losing money and there was a tendency to buy extra land to make the farms more economical. The Ministry felt that farming was being developed for its own sake, extending even to the maintenance of pedigree herds. In 1954 it was pointed out to the NHS that the Minister had no authority to run farms unless they were an essential part of a hospital; were the activities justified in each case? Regional boards set up small committees, and Sir George Godber told the story of rows about the future of a piggery in Kent. When the committee had reached a conclusion, it adjourned to view the pigs – which had all mysteriously disappeared into the hospital diet.
Medical education and staffing
Medical education
Medicine was one of the few degrees with a national control on student intake. From 1945 onwards, between 2,500 and 2,700 students were admitted annually and the medical profession was concerned that there might be too many doctors.214 In 1950 the BMJ said it was reasonable to accept the current size of the profession as satisfactory and not to expand it further. It was foolish to spend six years training someone who would then be given routine work that could be done better by a clerk or an auxiliary after six months’ instruction. By 1954, numbers had risen by more than a third since 1939. The BMJ pointed again to the risk of overcrowding the medical profession. It was doubtful if the country ought to be paying for the training of so many students; perhaps medical schools should reduce their intakes.215 One problem was medical immigration; hospital returns did not show the origin of junior staff, and it was not appreciated how many came from Commonwealth countries. More broadly, it was also important to ensure a supply of bright young people for the other professions: teaching, science and engineering. In February 1955 a committee under the chairmanship of Henry Willink was appointed to estimate the number of doctors likely to be required in the long term.216 It included the great and the good, people such as Lord Cohen, Professor Sir Geoffrey Jefferson and Sir John Charles. Two points of view were put to the committee: first, that an adequately staffed, comprehensive and rapidly expanding service needed more doctors; second, that too many doctors were already being trained for the positions likely to be available. Even before the recommendations were published, some medical schools cut their entries because they had been swamped with ex-servicemen taking up medicine, as well as the normal intake of 18- to 20-year-olds and sometimes substantial numbers of students from overseas. The committee, having reviewed each branch of the profession, concluded that there was indeed a risk of overproduction. Because it took at least five years to train a doctor, the numbers in the pipeline were already determined but, from 1961 to 1975, a reduced student intake would put the numbers back in balance. After that, some expansion would again be needed. The committee arrived at the wrong answer, largely because it failed to appreciate the numbers emigrating and immigrating. Willink’s name became a byword for disastrous planning.
Women played a small part in the medical staffing of the health service but their numbers were rising. Because the Goodenough Committee had recommended that medical school funding should depend on a policy of co-education, co-education became the norm. In 1948/9 there were 2,931 women medical students compared with 10,281 men. In London, at University College Hospital and King’s College Hospital, the ratio was 1 to 5. The nine other schools remained the stronghold of the male. Three ‘lagging behind in gallantry’ were Guy’s, the London, and the Westminster, where fewer than 5 per cent of students were women.
The aim of medical education had been to produce ‘a safe doctor’. On passing finals, a student could, in theory, practise immediately without further supervision. The RCP, in evidence to the General Medical Council, said that it was no longer possible to give a full training in all branches of medicine before qualification, and the attempt to do so should be abandoned.217 From 1 January 1953, full registration for unsupervised practice was not granted without proof of post-qualification experience. Newly qualified doctors had to work in a resident medical capacity at an approved hospital, institution or health centre for 12 months, usually six months as house physician and six as house surgeon. At the end of the year they could, in theory, do anything, although junior hospital doctors continued under supervision and, if entering general practice, it would usually be as an assistant.
The specialists of the future were educated in the environment in which they would be working. That was not so for general practice, because undergraduate and postgraduate education was hospital-based. Marshall Marinker called it a colonial epoch, with journals carrying good news from the hospital to the GP.218 However, the BMA under the chairmanship of Henry Cohen, reviewing medical training in 1948, considered that there might be a GP component of undergraduate training, that GPs might be on the teaching staff, and that students might visit practices.219 In 1950, a second committee recommended that future GPs should have a year of supervised practice, though nothing was said about the quality of the trainer.220 The Goodenough Report had stressed postgraduate education. Sir Francis Fraser, formerly Professor of Medicine at St Bartholomew’s and during the war Director General of the Emergency Medical Service, was appointed to develop postgraduate medical education in London. Failing in his ambition to establish a postgraduate teaching hospital in Bloomsbury, he welded the postgraduate institutes into a single school of the university, the British Postgraduate Medical Federation. His experience of wartime organisation had led him to the idea of regional postgraduate education long before the introduction of the NHS.221
Hospital medical staffing
From the outset there was a significant difference in the approach to manpower planning for GPs and for consultants. GPs were independent contractors appointing their own successors and colleagues. There were few controls other than a prohibition on entering over-doctored areas. It was largely up to the GPs to decide whether they wanted, or could afford, to expand their practices by accepting more patients or taking on a partner. Government wanted an adequate number of reasonably trained GPs rationally distributed and was not too concerned about the details.
It was different for consultants, who had chosen to be employees. A career structure based loosely on the pre-war hierarchy of juniors in the teaching hospitals was put in place (consultant, senior registrar, registrar, senior house officer and house officer). Pay of consultants and juniors was based on the reports from the Spens Committee, of which Lord Moran was a member. Key recommendations were that there should be equality of remuneration between different branches of specialty practice, and equality of status between different hospitals.222 If those in prestigious fields were to earn more than others, and the pay was to be greatest in teaching hospitals, there would be no hope of providing a full service throughout the country. Spens recommended that there should be distinction awards allocated by a predominantly professional body, to provide an adequate reward for those of more than ordinary ability. Specialists who undertook teaching responsibilities should also have a claim to higher pay. The Spens reports established a basic grade, equal in all specialties and places, but it looked more equal than it was. Merit awards were slanted towards general medicine and general surgery, the regional specialties and academia.
Permanent consultant posts were not established immediately; in the first year, each region was required to set up a review committee with two outside assessors from the Royal Colleges to grade hospital staff. They had to decide how much time should be spent in each hospital, and which individuals should be regarded as consultants. Some, though able to make a valuable contribution to the NHS, were considered below this standard. Many of these were in the tuberculosis service or psychiatry; 2,000 senior hospital medical officer (SHMO) posts were established for them, and they were offered the chance of a later review. Some GPs who had worked extensively in hospital were graded as consultants. Many who had previously held staff appointments turned wholly to general practice or found that specialists had been brought in to take over from them. Over two years, the move towards specialism, which had been taking place slowly throughout the century, was completed. The availability of health service finance for consultant appointments accelerated the process of professional evolution and the profession was now divided clearly into consultants and GPs.223
In 1948 there were about 5,000 consultants. Establishments could not be brought immediately to the level set out in the memorandum on the Development of consultant services.224 It took a long time to train specialists and there were severe shortages in pathology, psychiatry, radiology, anaesthetics and paediatrics. Some regions, for example Newcastle, North West Metropolitan and Oxford, moved ahead of others, getting staff while money was still available.225 Many senior registrar posts were established, particularly in general medicine and surgery, often when the real need was for more consultants. Early statistics suggested that there were twice as many senior registrars as were likely to find consultant posts. In 1950/1 regions were required to appoint small committees of senior or recently retired specialists to give their views about specialist staffing. Some of their estimates were clearly too high and there were such bizarre differences between regions that making the findings of the review public was quite impossible.226 The Treasury took fright at staff costs and the teams were quietly stood down. A central Advisory Committee on Consultant Establishments was established, chaired by George Godber, which included the JCC and professional advisers. It worked constructively, examining all applications for consultant posts, channelling them to the regions in greatest need, and trying to reduce senior registrars in overcrowded specialties such as general medicine and general surgery, and increase those in anaesthesia, psychiatry and pathology. Consultant numbers slowly increased by about 200 a year but regions did not always get what they wanted. In the early 1950s the South West Metropolitan RHB wanted to improve psychiatric services in the cluster of hospitals near Epsom. It applied for 20 psychiatrists in a single year, equivalent to the entire UK training programme. Sometimes those in general specialties objected to the appointment of colleagues who might, as in dermatology, relieve them of an interesting facet of their work.
The position of young doctors was given less attention. From 1952 controls were imposed on the senior registrar grade. There were 2,800 senior registrars in post, although the career structure required only 1,700, and consultants were being appointed at 38–40 years of age, instead of at 32–35. The Ministry helpfully pointed to the vacancies in His Majesty’s Forces and the Colonial Medical Service.227 When the growth in numbers of senior registrars was stopped, the registrar grade grew unchecked. Registrars had not committed themselves to particular specialties, and the grade was often used to help staffing problems. This mistake had far-reaching effects for which the health service is still paying.228 Some registrars were prepared to pursue a slim chance of ultimate appointment as a consultant rather than enter general practice.229 The position was only made worse by attempts to restrain growth of the consultant grade as an economy measure.230
Doctors’ pay and the Royal Commission
Doctors’ pay became a major cause of dispute. Spens had suggested a starting point based on 1939 money, leaving to others the problem of adjusting this to ‘present day values’. A differential had been established between the consultants and the GPs; the Danckwerts review had increased the GPs’ pay substantially, closing the gap. There was also concern about cost-of-living adjustments. In 1955 the BMA put forward a betterment factor of 24 per cent to cover the period 1951–1954; the Ministry of Health did not agree. The BMJ said:
This one-sided tearing up of a treaty is something which neither the profession nor we believe the public will in any circumstances tolerate. The recent replies from Ministry spokesmen are what we might expect from the Artful Dodger but not from men in a responsible position.231
The Times was similarly attacked by the BMJ. The professional classes as a whole were being squeezed out of decent existence. It was not only their economic position that was at stake but also a way of life that, with all its faults, was a powerful force for good in the country.232 The government’s repeated refusal to deal with pay claims on the basis of the Spens recommendations was seen as a breach of faith; possibly a breach of contract which should be tested in the Courts. Ministers in succession found reasons for inaction; Spens could not be afforded, it was inflexible or unrealistic. Perhaps something new should be sought.233 Lord Moran said Spens could not be thrown on the dust heap merely because it subsequently proved inconvenient. The government, shaken by the size of the Danckwerts award to GPs, finally repudiated Spens in 1957. It denied that it formed the basis of a contract, implying that doctors could challenge this in court if they wished.
In March 1957, Harold Macmillan, the Prime Minister, announced a Royal Commission on medical pay. It would look at medical earnings in comparison with the other professions, rather than upgrading pay in line with inflation. Sir Harry Pilkington, Chairman of Pilkington Ltd and a director of the Bank of England, was the Chairman. Punch published a David Langdon cartoon showing a Greek physician expostulating with Hippocrates about his new oath – “This is all very fine, Hippocrates, but there’s nothing here about pay”.234 The doctors thought that comparisons might be misleading and initially the BMA refused to co-operate. GPs were 24 per cent worse off than they had been in 1951 and were threatening resignation. The consultants wished to take whatever action they could; some were considering emigration. By May, however, there were new assurances. An exchange of letters between the Prime Minister and the profession led the RCP (rapidly and somewhat eagerly), the GPs and the doctors more generally to accept the Commission and to submit evidence. The BMA did so in November 1957.235
Nursing
Nurse education and staffing
There was no provision in the NHS Act 1946 for the training of nurses, and no organisation within the service charged with responsibility for it. Bevan was well aware of this, and the Ministry made farsighted proposals after the Wood Report (published in 1947). During lengthy discussions preceding the passage of the Nurses Act 1949, the nursing organisations whittled away ideas such as student status for recruits to nursing, and new training bodies separated from hospital management. They turned down the very reforms which they later struggled for many years to achieve. The most significant development was probably the growth of experimental forms of training.236 [Caption to illustration: training of nurses had been a major issue since the time of Florence Nightingale, and registration – fought for by Mrs (Bedford) Fenwick – dated from an Act of 1919 that came into force in 1923. The badge shown, no. 1001, was one of the first issued.]
From the outset there was a grave shortage of nurses, and many hospitals were critically dependent on students. For 600 beds, Aberdeen Royal Infirmary had 93 trained staff and 330 students.237 The NHS was reckoned to have 48,000 too few nurses, so that, on the one hand there was a need to expand the labour force, and on the other, an awareness of the risks of diluting a skilled staff by unskilled and semi-skilled people.238 Nurses were afraid there would be direction of labour, as in wartime, and that they would be sent to any hospital where there was a severe staff shortage. Bevan told them there was no power of direction; at most they would be asked – not ordered.239 State-registered nurses were supplemented by state-enrolled assistant nurses who undertook a shorter training and, in theory, were restricted to more limited roles. There was also a shortage of midwives as a result of public demand for more hospital confinements.240 There was grave concern about the staffing of sanatoria, chronic sick hospitals, mental illness and mental deficiency hospitals. Better methods of treating tuberculosis solved the first problem, and only slowly did a new outlook on the care of the elderly chronic sick, together with the grouping of hospitals, make their care easier. The problems of the mental illness and handicap hospitals were approached by attempts to select students more carefully to reduce wastage, recognition of the nursing assistant as an essential member of the team, and secondment of student nurses from general hospitals to gain experience in mental illness nursing as part of their training.
It was recognised that, although many student nurses enjoyed their training, until conditions improved in the worst of the hospitals, students and trained nurses would continue to leave. On the other hand, until the country secured more nurses, it would be impossible to improve the conditions about which nurses complained.241 Nursing absorbed a large and increasing proportion of young women entering the job market. The Minister of Labour, Walter Monckton, said that, in 1939, there had been 160,000 nurses in the country, but by 1952 this had risen to 245,000. The number of women reaching the age of 18 had, over the same period, fallen by 100,000. Twenty-one thousand entered the nurse training schools annually, a high proportion of those with appropriate educational qualifications. Although the NHS would have more things to do, there would be no more people to do them. Policies would have to conform to that reality.242 Nationally, the ‘wastage’ in the student years was 55 per cent. Before the second world war the General Nursing Council (GNC) had insisted on a minimum education level for recruits to nurse training, either school certificate or the GNC’s own test. This requirement was dropped on the outbreak of war and not restored afterwards. The educational level of nurses had fallen, save in large voluntary hospitals that had been able to maintain an entry requirement and still be selective. The official policy of both the GNC and the RCN was to re-introduce a minimum educational level, but there were internal divisions and neither the Minister nor the hospital authorities wished to take the risk of making matters worse.243 St George’s, Hyde Park Corner, was among many hospitals wishing, as Wood had suggested, to improve selection and reduce wastage. A wide and varied battery of performance tests was given to 126 nurses, who were also assessed by three independent judges on a rating scale covering 18 traits of personality and ability.244 Intellectual capacity and personal relationships were found to be the key characteristics of the good nurse, and it was hoped that selection based on these principles would reduce the number of unsatisfactory candidates accepted for training.
Lord Horder’s Nursing Reconstruction Committee (1942–1950) issued its third and final report on economic factors and nurse recruitment.245 Fifteen thousand new students were needed annually and, unlike entrants to most professions, nurses gave their services while learning. Hospitals regarded nurses as cheap labour, and there was no reason now, in a state service, for students to continue to subsidise the NHS at the expense of their own training. Students asked for practical bedside training, and for teaching that related theory to practice.
Horder recommendations (1950)
- Bedside work essential for training
- Hospitals not to exploit students
- Part-time working to be encouraged
- Adequate pay for all nursing posts; equal pay for equal work
- Nurses should help shape policy.246
Something was wrong with nursing; Professor Revans of the Department of Industrial Administration at Manchester University was funded by Nuffield to study the profession. His work suggested that nursing was a profession in transition. It had developed at a time when there were more women than jobs. Nursing and domestic service had been seen as God’s ways of ensuring that the idle fingers of middle- and working-class women were ‘not led into wickedness by the Devil’. Obedience was paramount and authority was worshipped. As a result, hospitals, while attracting a large number of recruits, were careless in their handling of them, and blamed the young women for leaving rather than themselves. Hospitals had widely varying levels of sickness and wastage; both were functions of the hospitals’ management. While student nurses had many grouses, the greatest was the fear of not being up to the job. Only the ward sister could give her confidence, and ward sisters had many other problems to cope with. Hospitals must address the problem; the age of authority and abundant cheap labour was coming to an end.247
The GNC revised its own training syllabus to include preventive and social issues as well as curative aspects of nursing. In 1956, the RCN published a statement of nursing policy. It reviewed established principles of the nursing profession in the light of social and economic change and developments in medicine, taking into account the recommendations of Lord Horder’s committee. The College looked at both ‘horizontal’ and ‘vertical’ issues. Horizontally there was the need to sustain recruitment while maintaining standards of entry by careful selection. Nurses in training should be given the tasks important to learning rather than to the hospital. Nursing teams, under the direction of a state registered nurse, were in the best interests both of the patient and of conserving nursing resources. ‘Vertically’ the profession should develop its leadership and look to the future, bringing into the profession more trained minds with a broad outlook, perhaps through a university degree course. In future nurses should be involved in health service management, as in the tripartite teams of hospital administrator, physician and matron, and make a nursing contribution to policy, for example, on management bodies, the Ministry and the CHSC. Training for leadership and to develop nursing on a factual and research basis was therefore important.248 One opportunity was the University of London diploma in nursing, a two-year part-time course for nurses, both in hospital and in public health. It covered basic medical sciences, preventive and social medicine, social psychology and modern nursing developments. Many of the profession’s high-flyers took the diploma. Was there a place for a higher qualification? If it were to be accorded a place in a university, nursing must demonstrate its own principles and laws; it must be neither lesser medicine nor a phase of social work, but valuable in itself. Academic studies would have to be strictly relevant to the practice of nursing, as medical education was relevant to clinical practice.249
The influence of North American nursing
For the next 50 years, British nursing was continuously under the influence of developments in North America, even though, in the view of Virginia Henderson, an outstanding American professional leader and educator, the relationship of the doctor and the nurse in the USA was not the same as in the UK. American doctors prescribed nursing care, but nevertheless might feel threatened by the experienced nurse, there being more friction than in the UK.250
Nursing in the USA had a long-standing academic basis, while it was only in 1956 that the first British nursing studies unit was established, in Edinburgh. A course in hospital economics for nurses at Teachers College, Columbia University, New York, had been established in 1899. Under the leadership of Adelaide Nutting, a Johns Hopkins graduate and former superintendent of its nursing school, the course grew into a nursing department offering a certificate programme, a bachelor’s degree and later a graduate programme. From the beginning the Teachers College programme was under pressure to provide nursing with skilled and well-trained educators and administrators, and by the 1930s it had become a cornerstone of nursing education. Virginia Henderson, later a member of its staff, pointed out that, in the early days of nursing research, when doctoral degrees in nursing were not available, nurses obtained degrees in sociology, anthropology or psychology instead and would naturally emphasise these disciplines when they began teaching; hence the dominance of the social sciences in the American nurses’ curriculum.
Nurses in the USA struggled to achieve autonomy as individual workers and as a profession, against hospital management and the existing culture of nursing itself. The general culture assumed that:
the nurse’s enduring authority should come from gender, not science; her place of work was the bedside or hospital, not the laboratory. Hospitals, in turn, demanded that nursing provide them with a workforce, not a research team. Physicians primarily wanted assistants, not colleagues. Working nurses often wanted reasonable hours, not more education, and nursing educators believed in science, but could not agree on its meaning.251
American academics tried to redefine and change nursing and nursing education. British nurses often went to work or to attend conferences in the USA to see what was happening. Articles appeared in the Nursing Times describing systems in use there, such as team assignment.252 The Wood Report had proposed a two-year course, and the separation of nursing schools from the hospital administration. The Nursing Times reported such an arrangement in Windsor, Ontario, which ran from 1948 to 1952. The school was a university institution and controlled the students’ time so that bedside clinical experience could be integrated with the course syllabus. There was no conflict, as there was in hospital schools, between the provision of a service and educational requirements. The students liked the course, liked nursing and continued to nurse. However, in spite of worldwide interest, the system was ended, in part because of the opposition of hospital management, the doctors and the nursing profession locally.253 A similar experiment was funded by Nuffield at the Royal Infirmary, Glasgow. It began in 1956 to test a more educational and less vocational system of training. Students were resident in the school, not the hospital, and took a two-year course to their finals, followed by a year as a member of the hospital staff before registration.254 St George’s ran a similar course.
Nursing practice
Nurses were having to adapt to an ever-changing pattern of patient care. Only a short time previously almost all patients were, at some stage in their illness, completely helpless. Now the aim was to avoid the need for total care or to diminish its duration as much as possible.255 Nurses needed to go beyond physical needs and consider the relief of anxiety and pain. Earlier discharge from hospital to the community also altered the pattern of the district nurse’s work because continued supervision might be required.256
The Nuffield job analysis
After the Wood Report, the Nuffield Provincial Hospitals Trust explored the ‘proper task of the nurse’ and undertook a job analysis of nurses’ work in hospital wards, directed by Mr HA Goddard. Nuffield selected hospitals with nurse training schools, so there were no data on hospitals for the chronic sick, a significant gap because some of nursing’s worst problems were in the chronic wards, where student nurses were seldom seen.257 Minute by minute, day and night, the activities of nurses of all grades were tracked. Published in 1953, the report demonstrated that what was happening in the wards was not what people thought. It called for a restatement of nursing theory:258
- The special province of the trained nurse was satisfying patients’ human needs, not just skilled technical nursing
- Nursing should be done by trained nurses, not supervised by them
- Trained nurses should be responsible for the total care of a specific group of patients
- Undisturbed rest for patients was not possible as the day lasted from 5am to 10pm
- The time spent by sisters teaching student nurses was negligible
- The end-result of nurse training seemed to be not nursing but administration.
The trained nurse might still attempt to cover all the tasks concerned with the care of the patient, but in practice she could no longer do everything, and many tasks were undertaken by student nurses and orderlies. Basic nursing took up 60 per cent of the time of a first-year student nurse, but as she became more senior she did less of this and an increasing proportion of ‘technical nursing’. The heavy contribution made by student nurses to basic nursing exposed the problem with Wood’s recommendation of ‘student status’: if education was to take priority over service demands, who would do the work – more auxiliary help on the wards?
Sisters, who thought they did much teaching, in fact spent half their time on ward organisation and only five minutes a day with student nurses. There were two possible lines of development: the nurse could become recognised as a technician, or she could insist that the basic and technical aspects of nursing were indivisible. In the USA, the head nurse, graduate nurse, practical nurse and nursing aide were each responsible for a particular aspect of the nursing care of a group of patients. The danger was, however, that both basic and technical functions originated in human need and were hard to divide. An auxiliary making a bed might not notice the worsening condition of a patient that would be immediately apparent to a trained nurse. Ward sisters had a particularly difficult role, responsible at the same time for the care of patients, the administration of the ward and the training of student nurses.
The study also showed the inhumanity of a system that gave sick people little time to rest during a 17-hour day. Nuffield established an advisory panel to comment on the results of its enquiry. The panel said that nursing should be done by trained nurses, not merely supervised by them. Basic nursing should not be delegated wholly to an auxiliary grade, although a ‘second pair of hands’ was desirable. Nursing skills should be conserved by the reallocation of many non-bedside tasks, and wards should be divided into a number of nursing teams, each the direct responsibility of a trained nurse. Goddard, the director of the enquiry, was convinced that staff were not used to best advantage, and that there was, in fact, an adequate number of nurses. When hours were spent moving screens about the wards, chaperoning doctors, or on tasks not requiring their skills, the problem was one of maldistribution.259 The Nuffield project suggested that nurses themselves owed it to their patients to be more active in research, as were the American nurses.260
- Job assignment: Sister delegates to the nurses different duties, which each nurse carries out for all patients in the ward.
- Team nursing: Nursing personnel are divided into two or three teams, where possible with a staff nurse acting as team leader, the teams including an assistant nurse, student or pupil nurses and perhaps a domestic orderly. The staff nurse considers the needs of the patients and delegates duties according to the skills of the individuals.
- Case assignment: Each nurse is responsible for the total care of a certain number of patients, conducive to seeing the patient as a whole person and considering all his needs, social, mental, spiritual and physical.
Source: Catherine Hall, RCN General Secretary: Nursing Times, 2 May 1958.
An RCN official said that patients were being nursed more and more in bits: student nurses did all the basic and most of the technical nursing, and the qualified nurse forsook the bedside for administration.261 The House of Lords considered the Nuffield Report, and Lord Woolton, speaking for the government, said that what the nurses needed was reorganisation. There could be administrative support and greater use of orderlies.262 Lord Moran (ex President, RCP) said that, while he hoped that administration was not the peak of every nurse’s ambition, regrettably it represented promotion and was better paid. Moran argued in favour of ‘dilution’, although this was controversial. There was already dilution in medicine; nurses did jobs the doctors had done years previously. The Minister, Ian Macleod, asked the Standing Nursing Advisory Committee to study the report and patterns of ward organisation. Experiments, particularly in ‘team nursing’, were set in hand.
There was a five-year trial at St George’s, led by the matron, Muriel Powell. Patients were divided into small groups of 9–13, each allotted to a separate team of nurses led by a staff nurse. Team methods were based on the principle that good nursing involved the total care of patients, and student nurses liked them because they could practise total nursing within the team. It was a compromise between job assignment, which was cheaper but might be associated with poorer care, and case (patient) assignment, which was too expensive.263 On the whole, the team system produced high-quality personal and technical nursing, and staff satisfaction. There were, however, problems; team nursing was designed to produce a higher quality but not a greater quantity of nursing care, so it was less adaptable at times of pressure and crisis than job assignment. It was not used at St George’s at night. The ward teams sometimes competed with each other, even for equipment they wished to use simultaneously.
The American literature suggested that team organisation ensured better supervision of auxiliary nursing staff and was more democratic; British literature stressed the more responsible job for staff nurses, with wider responsibilities. It seemed important to keep teams as small as possible, consistent with adequate trained supervision. Muriel Powell also tried case assignment.264 Junior student nurses were given two patients, and seniors five. Students learned quickly, but the young nurse might identify too much with the patient if he was very ill. Routine duties might be ignored; the ward steriliser might boil dry.
In hospital, trained nurses might provide only 25 per cent of patient care; in the district, it was nearer 100 per cent. Local authorities developed training schools for their staff. For example, the Essex scheme, opened in 1951, provided experience for student nurses, a part II midwifery training school and theoretical training for Queen’s (district) nurses. There was a central nurses’ home, for many of the district nurses were resident; by 8.30 am a fleet of cars and bicycles was ready to leave the home in all directions, as the district nurses went to work.265
Dame Elizabeth Cockayne, Chief Nursing Officer at the Ministry from 1948 to 1958, talked on the eve of her retirement about changes in nursing practice.
We find more physicians discussing patients’ problems with the nursing team and we have seen the nurse-patient relationship change with the progress in medicine. The patient’s point of view is given more attention today, indeed the patient is part of the team. We find ourselves doing things with patients, and not just for them as previously, leading them to self-direction and graduated degrees of independence. As a profession we need to become increasingly self-analytical, and to examine what we are doing and why.266
The image of the nurse was beginning to matter. The Nursing Times was displeased with the BBC for its production of a documentary about student nurse training, ‘Under her skilled hand’. The script did not reflect the dignity and sincerity of the title. What would have been the impression of parents whose daughters were considering nursing as a career?267
Nursing uniform
Nurses’ uniforms could always stimulate debate.268 Some saw them as proof of the nurse’s competence and a reassurance to the patient, and viewed any threat to them as an attack on professional dignity. Others held the nurses’ cap to be a relic of religious practice, and the long starched apron, reaching from the base of the starched collar to the ankles, to have begun as a hygienic precaution. Both had now shrunk in size to become more a badge of office than a part of hygiene. Since the uniform served no practical purpose, some thought it might be banished. The stylish, eminently simple but well-cut dresses of American nurses might be envied. “Were not British uniforms old-fashioned, difficult to launder and hide-bound by tradition?” asked a student nurse in the Nursing Times.269
Nursing administration
The Nurses Act 1949 implemented some of the less contentious proposals in the Wood Report. The remit and membership of the GNC were broadened and Area Nurse Training Committees were established. The function of these committees, placed between the GNC and the nursing schools, was vague. At senior management level, when the RHBs were being established, nursing organisations were asked to nominate members. The RHBs appointed their own senior staff, including nursing advisers, the future regional nursing officers (RNOs).
In the hospitals, the role and the pay of the matrons varied according to the number of beds. Those in the teaching hospitals were secure in their power and their posts, responsible to their boards, and independent of regions. Their main concern was to ensure that the board understood that its wider policies might affect nursing. Matrons ran the schools of nursing as well as being responsible for the running of an efficient nursing service. At the London Hospital, the matron looked after not only the nursing school, but also the schools for radiographers, physiotherapists, occupational therapists and dieticians. Matrons were responsible for the linen room, laundries, female domestics, catering and other departments, controlling many services affecting the patient’s environment. A member of Matron’s office staff was often the most senior person resident in the hospital at night and the weekend, taking decisions well outside her purely professional capacity.
In the smaller hospitals, matrons had less authority, for up to a dozen hospitals might be grouped within an HMC. The group secretary could not consult all of them about everything, yet each matron felt herself autonomous and neglected. Far from attending meetings of the HMC, the matrons often did not even see the minutes. How did the HMC get nursing advice? Within the groups, division on functional lines was taking place. Initially the catering officers, supplies officers and domestic supervisors, although undertaking duties previously carried out by the nurses, remained under matron’s authority. Following the Bradbeer Report, domestic tasks passed increasingly to lay administrators. Often the matron’s precise responsibilities were not laid down in a hospital’s standing orders, and matrons found themselves appointing and dismissing staff on the basis of traditional practice, without any written authority to do so.270
Emerging problems
Financial disparities and rising expenditure
The initial allocations to the RHBs were not equitable, but the way in which the NHS accounts were presented tended to conceal regional disparities. Expenditure was presented under ‘functional’ subheads, for example, the cost of nursing staff by grade nationally, not by region. Regional allocations, settled each year, were composed of two elements: a static or inherited element to keep the service running at the existing level, and a developmental element to cover new services. From 1951 to 1954 the Acton Society Trust, an organisation concerned with the place of large-scale organisations in society, was funded by Nuffield to examine the organisation of hospitals under the NHS. The Acton Society recognised that the Ministry was trying to improve matters, but doubted whether the attempt to ‘level’ the allocations had gone far enough or had been worked out on a fair basis.271 The Ministry’s policy was to use its discretion over the development element to level up the more needy regions. Over the first decade some slight progress was made. The share of one group of regions (Newcastle, Sheffield, Birmingham, Manchester, Liverpool and Wales) increased from 39.11 per cent to 42.22 per cent. The richest regions, the metropolitan boards, fell from 41.72 per cent to 38.30 per cent, and the remainder (Leeds, East Anglia, Oxford and South Western) were stable. The Acton Society thought this reasonable, particularly as little evidence was available on the efficiency and economy of different kinds of hospital that took adequate account of the nature and quality of the services provided.
It was a long-standing socialist belief that a state medical service would save money. In 1911 Lawson Dodd wrote:272
The economy of organisation, the greatly lessened cost of illness due to the increase in sanitary control, and the immense amount saved in the reduced number of working days lost through illness, would make the health tax seem light, and it would be regarded as a profitable form of insurance.
In the Beveridge Report (1942)273 the Government Actuary said that the fundamental changes envisaged could result in the costs differing materially from the estimates that had been made. However, the report itself stated that the development of health and rehabilitation services would lead to a reduction in the number of cases requiring them. Beveridge, like Lawson Dodd, looked forward to a service that would diminish disease by prevention and cure, and believed that future developments would reduce the number of cases requiring health service care. Enoch Powell, in 1961, referred to this as a miscalculation of sublime dimensions.274 He thought that, in theory, it would be possible to put together a package of health services limited to those that would maximise the gross domestic product, concentrating on people who had a substantial period of productive life before them. The weakling, the old and the subnormal would be left to die. Powell considered that such a health service would be scarcely conceivable, even in a nightmare dictator state. It would not be a health service at all. It seemed virtually certain that the increasing outlay as medical science progressed would be more and more ‘uneconomic’. Progress in medicine consisted not of doing things more cheaply and simply, but in discovering complex and difficult things to do that previously could not be done at all. Medicine was buying life at an ever-increasing marginal cost.
The government had moved into strange territory. A free and universally available service on this scale was highly unusual. The provisional estimates of costs for the first year were based on past hospital accounts, some of which were sketchy in the extreme. The estimates were rapidly exceeded. In 1946, when the NHS Bill went to Parliament, the estimate of the total net cost annually was £110 million. At the end of 1947 it was £179 million. At the beginning of 1949 a supplementary estimate of £79 million was added and the figure turned out to be £248 million. The actual cost in 1949/50 was £305 million. The following year it was £384 million. The government became alarmed.
Analysing the difficulties
Dr Ffrangcon Roberts, a radiologist at Addenbrooke’s, was an early and perceptive commentator.275 Early in 1949, he drew attention to the unreliability of the predictions because of three factors:
- they ignored the effect of the ageing population
- they ignored the intrinsically expansile nature of hospital practice; previous government experience had been of chronic care and general practice, not the activities of the voluntary hospitals where the application of science resulted in expansion with accelerating velocity in every branch of medicine
- they were based on a false conception of health and disease. ‘Positive health’ was neither easily nor permanently achieved. The fight against disease was a continual struggle which was ever more difficult, promoting the survival of the unfit. We were cured of simpler and cheaper diseases to fall victim later on to the more complex and expensive.
Roberts saw medicine, like other commodities, as a core of essentials surrounded by inessentials extending to luxury and extravagances. The present rate of expenditure would lead to national ruin.
The alternative is hardly less comforting. It is that a limit will be set by shortage of personnel and materials. This means that medicine will be rationed and controlled, and there is no reason for supposing that nationalized medicine possesses any moral superiority rendering it immune from the vices which rationing and control invariably bring in their train. Medicine is not above economic law but strictly subject to it.
The NHS accounted for no mean percentage of the national budget, and money was also needed for education, transport, industrial equipment and defence. The Korean war had imposed an added burden on national finances. Efficiency and economy were therefore continuing concerns. Whereas in a service such as education the population was limited to those of school or university age, and the costs of teaching determined by the syllabus, there were no similar constraints on the NHS. Within a year Labour was on the defensive about the rising cost. The Conservatives were ‘shocked and alarmed’, saying that, although they too had planned a health service, a great bureaucracy was growing up and there was enormous and wasteful extravagance. The Minister had shown himself quite irresponsible in financial matters and heedless of the best interests of patients as well as of the medical profession. He should go. Bevan replied that it was hard to know what the Conservatives were complaining about – was it the inaccurate estimates or spending the money at all?276
Costs kept rising. The BMJ believed that, ignoring the British capacity for muddling through, the NHS was heading in the direction of bankruptcy.277 The illusion that they were getting something for nothing led people to seek free supplies of household remedies for which they had previously paid, such as aspirin, laxatives, first-aid dressings and cotton wool. Many were going round with two pairs of spectacles when one would have done. Charges, the journal argued, would not offend against the concept of a comprehensive service without financial barriers.
The policy, based upon the decisions of the (wartime) Coalition Government, had been put into execution by a Minister who could not resist the temptation of behaving like a fairy godmother to an impoverished nation. The medical profession had welcomed the service in spite of doubts about the role of the state in the care of the sick . . . Now the honeymoon period was over; the relations between profession and state were strained because of shortage of money; and the NHS would have to undergo successive modifications in the next few years if it was not to fail. Perhaps the public saw the main benefit as not paying for medicine at the time of receiving it – and the public had run riot at the chemist’s shop.278
In 1950, the Chancellor, Hugh Gaitskell, forced the issue of charges. Labour passed legislation making it possible to charge for drugs, spectacles and dentures, but did not impose the charges. Bevan resigned in 1951, in part because of his opposition to charges, but mainly because he felt that government had failed to distribute the tax burden properly between different social classes, and that military expenditure had been spared when social services had not.279 The BMA argued for hotel charges on admission to hospital in its evidence to the Select Committee on Estimates and, in May 1951, charges for dentures and spectacles were introduced. A ceiling was applied to expenditure on the health service. The Chancellor stated that, in 1952, the cost of the service would be kept within the same bounds.280
The rising cost of prescribing was soon seen as one of the great problems confronting the NHS.281 Costs rose about 45 per cent during the first five years of the service. In 1950 the CMO wrote to GPs to say that, while they had the right to prescribe whatever was necessary for an individual, unnecessary expenditure should be avoided, and that there were mechanisms to deal with excessive prescribing.282 In October 1951, Labour lost the general election and the Conservatives came to power. The following year a prescription charge of one shilling (5p) was introduced. The Ministry began to issue ‘Prescribers’ notes’ to GPs as an educative measure. In 1953 the Joint Committee on Prescribing suggested that preparations that were not in the British Pharmacopoeia, Pharmaceutical Codex or National Formulary, that had not been proved of therapeutic value, or that had dangerous side effects should not be prescribable under the NHS. Doctors were asked to check the costs of comparable drugs and review the frequency and quantities prescribed. Medical school deans were asked to teach students and young doctors about the cost of prescribing. The BMJ saw this as an attempt to deprive doctors of the responsibility of deciding whether, in a particular case, the benefits outweighed the dangers. These were clinical judgements, which had nothing to do with the economics of prescribing. The dangers of restriction, said the journal, were far greater than the dangers of liberty.283 By 1956, 228 million prescriptions cost £58 million.
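As a rough check on the scale of the problem (an illustrative calculation from the figures above, not one made in the source), the 1956 figures imply an average cost of about five shillings per prescription, so the one-shilling charge recovered only around a fifth of the average cost:

\[
\frac{\pounds 58{,}000{,}000}{228{,}000{,}000\ \text{prescriptions}} \approx \pounds 0.25 \approx 5\text{s }1\text{d per prescription}.
\]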
Reviewing the NHS – the Guillebaud Committee
In May 1953, the Conservative government appointed a committee, chaired by Claude Guillebaud, a Cambridge economist, to review the present and prospective cost of the NHS, to suggest whether modifications in organisation might permit effective control and greater efficiency, and to advise how a rising charge upon the Exchequer could be avoided.284 The Committee’s work proceeded at a leisurely pace, which was to the advantage of the NHS because, in the meantime, it was hard for the Treasury to insist on a major economy programme.
It was a review perhaps more fundamental than the Royal Commission on the NHS two decades later.285 The terms of reference allowed the Committee to go well beyond financial issues, and this it proceeded to do. Richard Titmuss, a social scientist who had worked at the MRC Social Medicine Research Unit at the Central Middlesex Hospital before moving to the London School of Economics, and Brian Abel-Smith, his assistant and an economist, provided the Committee with a detailed analysis of the costs.286 Starting with definitions of ‘cost’, in actual and in 1948/9 prices, and of an ‘adequate service’ (the best service possible within the limits of available resources), Guillebaud collected a wide range of evidence and considered the past, present and future of general practice, hospitals, local authority services and population demography. The report represented a turning point in political thinking about how much should be spent on health care and how the expenditure should be measured.
Cost of the NHS (England and Wales), net, at actual and 1948/9 prices (£ million), and as a percentage of gross national product (GNP)

| (£ million) | 1948/9 | 1949/50 | 1950/1 | 1951/2 | 1952/3 | 1953/4 |
| --- | --- | --- | --- | --- | --- | --- |
| Actual net cost | 327.8 | 371.6 | 390.5 | 402.1 | 416.9 | 430.3 |
| GNP | 9,349 | 9,907 | 10,539 | 11,560 | 12,487 | 13,273 |
| Net cost at 1948/9 prices | 327.8 | 369.8 | 388.3 | 374.1 | 370.6 | 380.8 |
| Proportion of GNP | 3.51% | 3.75% | 3.71% | 3.48% | 3.34% | 3.24% |

Source: Report of the Guillebaud Committee.287
These figures were updated in 1961 and published in Hansard.
The increased cost, when adjusted for inflation, was less alarming than had been thought. Indeed, as a proportion of gross national product, costs were actually falling. Analysis showed the effect of higher levels of wages and prices, and of the significant increase in staff costs as establishments had been progressively enlarged. The figures for 1952/3 had to be adjusted for the Danckwerts award to GPs, which added £24 million to gross costs and included back-pay owing. The cost of the service per head of the expanding population had risen from £7.65 to £8.75. The report stated that, contrary to public opinion, the diversion of funds to the NHS had been comparatively insignificant. Most of the rise in hospital expenditure had come from inflation, although there had also been a rise in the volume of goods and services purchased, and most of the rise in local health authority costs was likewise due to inflation. Net expenditure on executive council services fell, partly because of the charges made to patients. There had been a rise in the cost of drugs, mainly antibiotics, and more prescriptions were being issued. The ways in which these costs could be controlled were considered, but a restricted list was rejected, as were hospital boarding charges.
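The Committee’s central finding can be reproduced directly from the table above (an illustrative calculation, assuming the figures as printed): cash spending rose by almost a third between 1948/9 and 1953/4, yet the service’s claim on national output fell:

\[
\frac{430.3}{13{,}273} \approx 3.24\%\ \text{of GNP in }1953/4, \qquad \text{against} \qquad \frac{327.8}{9{,}349} \approx 3.51\%\ \text{in }1948/9.
\]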
The Committee was concerned at the low level of capital expenditure, roughly £10 million per year compared with pre-war levels nearer £30 million. There could be no doubt about the inadequacy of the hospital stock: the Hospital Surveys had estimated that 45 per cent of hospitals predated 1891 and 21 per cent predated 1861.288 A return to the pre-war level of spending was recommended. Guillebaud said that it was difficult to see how more money could usefully be spent on health promotion, and that the approach to health centres should continue to be experimental. Noting the division of responsibility for maternity services, the result of history rather than logic, the Committee recommended an early review, which was subsequently chaired by the Earl of Cranbrook. The care of the elderly also required more attention.
The report provided no basis for a government attack on NHS expenditure on grounds of extravagance. However, accounting systems were improved and the Ministry maintained a year-on-year record of the changes in the cost of the NHS; such figures were published at the end of the Report of the Royal Commission on the NHS (1979). Guillebaud examined organisational issues such as the integration of the tripartite health service and the relationship of teaching hospitals to regional boards. The transfer of local authority health services to regional boards, or vice versa, was not seen as practical politics, and no structural change in the organisation was recommended. Sir John Maude, a former permanent secretary of the Ministry of Health and a member of the Committee, entered a note of reservation. He analysed past history and the medical profession’s current concerns about the tripartite system, and came to the conclusion that:
a serious weakness of the present structure lies in the fact that the NHS is in three parts, is operated by three sets of bodies having no organic connection with each other and is financed by three methods, one of which differs radically from the other two . . . some regard it as a major flaw in the scheme, others as no more than a piece of administrative untidiness.
Maude thought it might be expedient at some future date to return to the earlier conception of a unified health service based on local government, but, to enable the transfer of the NHS as a whole, reorganisation of local authority administration and finance would probably be needed.289
The first review of the NHS had given it a clean bill of health, and the Acton Society Trust agreed that the structure was basically sound.290 The Minister of Health, Mr Turton, hoped that everyone would note with satisfaction, but not with complacency, that the NHS record was one of real achievement; additional money, however, could not be committed because of the economic situation. So long as there had to be a limit on the financial resources available, the Minister would not be able to do at once all the things that needed to be done. The government accepted the Committee’s conclusion that, though there were weaknesses, the structure was sound, any fundamental change would be premature, and stability over a period of years was needed.291
From now on it became impossible for governments to attack the NHS. Disagreements in future would be about means, not ends. However, the medical profession was not unanimous that all was well. The right-wing Fellowship for Freedom in Medicine published proposals for the reform of the NHS, advocating state-subsidised compulsory insurance, covering 90 per cent of the cost for those in a position to pay for it, and a free health service for all others.292 Free drugs should be limited to life-savers and at least some direct responsibility should be placed on patients for their health. The introduction of token charges would make them aware of the great benefits received.
The health service had many achievements to its credit.293 The Lancet believed that it was one of the biggest improvements in the life of the country since the war. Much had been done to better the conditions of medical care, especially in hospital, thanks to the hard and intelligent work of many people, professional and lay; the journal thought, however, that NHS administration might still be made more efficient and more appropriate.294 In 1957, the BMA Council established its own Committee of Inquiry into the NHS, a successor to the BMA Medical Planning Commission of 1941/2 that had proposed or supported many concepts subsequently incorporated into the NHS.295 Doctors had accepted the principle of the service, but not all its features. Increasingly they cast themselves as its defenders, rather than its attackers.